I am using the ELK stack for centralised logging from my Django server. The ELK stack is on a remote server, and logstash.conf looks like this:
input {
  tcp {
    port => 5959
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["xx.xx.xx.xx:9200"]
  }
}
Both services, elasticsearch and logstash, are running (checked with docker-compose logs logstash).
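To rule out network problems, a minimal reachability check from the Django host could look like the sketch below (the IP is the same placeholder as in the configs; a refused or timed-out connection would point to a firewall or Docker port-mapping issue rather than the logging setup):

import socket

# Check that the Logstash TCP input is reachable from the Django host.
# xx.xx.xx.xx:5959 are the same placeholders as in logstash.conf above.
with socket.create_connection(('xx.xx.xx.xx', 5959), timeout=5):
    print('Logstash TCP input is reachable')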
My Django server's settings file has logging configured as below:
LOGGING = {
    'version': 1,
    'handlers': {
        'logstash': {
            'level': 'INFO',
            'class': 'logstash.TCPLogstashHandler',
            'host': 'xx.xx.xx.xx',
            'port': 5959,               # Default value: 5959
            'version': 0,               # Version of logstash event schema. Default value: 0 (for backward compatibility of the library)
            'message_type': 'django',   # 'type' field in logstash message. Default value: 'logstash'.
            'fqdn': True,               # Fully qualified domain name. Default value: False.
            'tags': ['django.request'], # List of tags. Default: None.
        },
    },
    'loggers': {
        'django.request': {
            'handlers': ['logstash'],
            'level': 'DEBUG',
        },
    },
}
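For reference, this LOGGING dict can also be exercised outside of Django to confirm that the handler itself emits records (a minimal sketch, assuming LOGGING is the dict above):

import logging
import logging.config

# Apply the same LOGGING dict as in settings.py and push one record
# through the configured 'logstash' handler; it should appear in
# Elasticsearch if the handler and the network path both work.
logging.config.dictConfig(LOGGING)
logging.getLogger('django.request').info('test record from dictConfig check')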
When I run my Django server, the Logstash handler appears to pick up the logs (nothing is printed to the console). I used the python-logstash library on the Django side to build the configuration above, but the logs never reach my remote server.
I have checked many related questions and verified that the services are running and the ports are correct, but I have no clue why the logs are not being sent to Logstash.
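To isolate the handler from Django entirely, a direct test with python-logstash would look roughly like this (same placeholder IP as above; any record logged here should show up in Elasticsearch if the pipeline works):

import logging
import logstash

# Bypass Django: attach a TCPLogstashHandler directly and send one record.
# version=0 matches the schema version used in the Django settings above.
test_logger = logging.getLogger('logstash-test')
test_logger.setLevel(logging.INFO)
test_logger.addHandler(logstash.TCPLogstashHandler('xx.xx.xx.xx', 5959, version=0))
test_logger.info('direct python-logstash test message')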
Have you tried setting fqdn to False? – Enculturation