Feeding Logstash from an Azure Web App. How?

I have a web app hosted on the Azure platform and an ELK stack hosted on a virtual machine, also in Azure (same subscription), and I am struggling to find a way to ship the logs from the app to Logstash.

A web app stores all its files on storage that is only accessible via FTP, which Logstash does not have an input plugin for.

What do people use to ship logs to ELK from web apps? If it were running on a VM I would use NXLog, but that's not possible for a Web App.

I also use log4net and tried a UDP forwarder, which worked against my local ELK stack but not the Azure-hosted one, despite my adding the public UDP endpoint.
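
For reference, the log4net side is just the stock UdpAppender, roughly like this (the IP, port and pattern are placeholders, not my real values):

    <log4net>
      <appender name="UdpAppender" type="log4net.Appender.UdpAppender">
        <remoteAddress value="23.1.2.3" />   <!-- placeholder: public IP of the ELK VM -->
        <remotePort value="5960" />          <!-- placeholder: matches the Logstash udp input port -->
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
        </layout>
      </appender>
      <root>
        <level value="INFO" />
        <appender-ref ref="UdpAppender" />
      </root>
    </log4net>

On the Logstash side this is matched by a plain udp input listening on the same port.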

Adipocere answered 17/11, 2015 at 10:0 Comment(2)
"A web app stores all its files on a storage only accessible via FTP " - This isn't true. Web apps are not limited to using their local storage. They are perfectly capable of working with Azure Storage (e.g. blobs), Azure File Service, databases, etc.Tanked
@DavidMakogon - That's true. I did look at blob storage as an option, but it isn't obvious how to get server logs and log4net files saved to it. I am wondering whether anyone has done this and configured Logstash to use blob storage as an input feed?Adipocere

Currently I am using Serilog to push my application log messages (in batches) to a Redis queue, which in turn is read by Logstash to enrich them and push them into Elasticsearch. This results in a reliable distributed setup which does not lose any application logs unless the Redis max queue length is exceeded. Added bonus: Serilog emits JSON, so your Logstash config can stay pretty simple. Example code can be found here: https://gist.github.com/crunchie84/bcd6f7a8168b345a53ff
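
The Logstash side of that pipeline is just a redis input feeding an elasticsearch output; a minimal sketch (the host, list key and Elasticsearch endpoint are placeholders):

    input {
      redis {
        host      => "my-redis-host"   # placeholder: the Azure-hosted Redis instance
        port      => 6379
        data_type => "list"            # Serilog pushes each batch onto a Redis list
        key       => "logstash"        # placeholder: the list/queue name Serilog writes to
        codec     => "json"            # Serilog already emits JSON, so no parsing filters needed
      }
    }
    filter {
      # enrichment (mutate, geoip, etc.) goes here
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]    # placeholder: the Elasticsearch endpoint
      }
    }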

Needleful answered 3/12, 2015 at 14:22 Comment(2)
Hi Mark, why don't you pass the Serilog log directly to Elasticsearch? Going from Serilog to Redis to Logstash to ES looks complicated. Check this repo: github.com/serilog/serilog-sinks-elasticsearchEraeradiate
It was a matter of decoupling; Azure had hosted Redis and it is very fast at ingesting data. Elasticsearch was not a hosted solution and, since it needs to process the submitted JSON, could end up being a bottleneck, potentially influencing the log-producing app.Needleful
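
For completeness, the direct approach with the serilog-sinks-elasticsearch package mentioned above looks roughly like this (the node URL and index format are placeholders):

    // requires the Serilog and Serilog.Sinks.Elasticsearch NuGet packages
    using System;
    using Serilog;
    using Serilog.Sinks.Elasticsearch;

    var log = new LoggerConfiguration()
        .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://my-elk-vm:9200")) // placeholder ES endpoint
        {
            AutoRegisterTemplate = true,                  // let the sink create an index template
            IndexFormat = "webapp-logs-{0:yyyy.MM.dd}"    // placeholder daily index name
        })
        .CreateLogger();

    log.Information("Hello from the web app");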

Azure now has a project on GitHub called azure-diagnostics-tools that contains Logstash plugins for reading from blob and table storage, among other things.

Presumably, you can just enable diagnostics logging to blob/table storage for your Web App and use the Logstash plugin to get the logs into Elasticsearch.
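
For example, with the logstash-input-azureblob plugin from that repo, an input pointed at the Web App's log container looks roughly like this (account, key and container names are placeholders; check the repo's README for the exact option names in the current version):

    input {
      azureblob {
        storage_account_name => "mystorageaccount"      # placeholder storage account
        storage_access_key   => "<storage access key>"  # placeholder key
        container            => "webapp-http-logs"      # placeholder: the container the Web App logs to
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]                     # placeholder Elasticsearch endpoint
      }
    }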

Regarding how to get log4net files into blob storage - you can use log4net.Appender.TraceAppender to write to the Trace system, which will cause the entries to be collected into blob/table storage when that option is enabled in the Web App.
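
A minimal log4net configuration for that (the pattern is illustrative) would be:

    <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date %-5level %logger - %message" />
      </layout>
    </appender>
    <root>
      <level value="INFO" />
      <appender-ref ref="TraceAppender" />
    </root>

With Application Logging (Blob) turned on under the Web App's diagnostics log settings, those Trace entries end up in the configured blob container.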

There's also an alternative to FTP for local files - you can use the Kudu REST VFS API to access (and modify) files by HTTP. I found this to be significantly faster and more reliable than FTP.
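
A sketch of reading the LogFiles folder through that API (the site name and deployment credentials are placeholders):

    // C#, using the site's deployment (publishing) credentials for basic auth
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;

    var siteName = "my-web-app";                                   // placeholder site name
    var creds = Convert.ToBase64String(
        Encoding.ASCII.GetBytes("$my-web-app:publishPassword"));   // placeholder deployment credentials

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", creds);

    // A trailing slash on a VFS path returns a JSON directory listing; without it you get the file itself.
    var listing = await client.GetStringAsync(
        $"https://{siteName}.scm.azurewebsites.net/api/vfs/LogFiles/");
    Console.WriteLine(listing);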

Macneil answered 10/5, 2016 at 18:12 Comment(1)
I ended up going with Mark's approach, but used Serilog to export directly to Elasticsearch. The Azure Ruby libraries are immature, and reading from Azure storage does not scale well.Macneil