The application sends logs from many machines to the Amazon cloud and stores them in a database.
> Let's assume: each machine produces 1 kB of logs every 10 seconds, and the number of machines ranges from 1000 to 5000.
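That works out to roughly 100 to 500 messages per second in total (1000 to 5000 machines, one 1 kB message each every 10 seconds), i.e. on the order of 100 to 500 kB/s of log data.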
My first approach was to queue the logs in RabbitMQ and have a RabbitMQ consumer store them in an SQL database.
- Do I really need RabbitMQ when the consumer only performs a basic storage operation?
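For concreteness, a minimal sketch of such a consumer, assuming pika for RabbitMQ and SQLite standing in for the actual SQL database; the queue name, table layout, and message format are placeholders:

```python
# Minimal consumer: read log messages from RabbitMQ and insert them into SQL.
# Assumes pika for RabbitMQ and SQLite as a stand-in for the real SQL database;
# the queue name "machine_logs" and the table layout are made up for illustration.
import json
import sqlite3

import pika

db = sqlite3.connect("logs.db")
db.execute("CREATE TABLE IF NOT EXISTS logs (machine_id TEXT, ts TEXT, line TEXT)")

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="machine_logs", durable=True)

def on_message(ch, method, properties, body):
    # Each message is assumed to be a JSON object with machine_id, ts and line.
    record = json.loads(body)
    db.execute(
        "INSERT INTO logs (machine_id, ts, line) VALUES (?, ?, ?)",
        (record["machine_id"], record["ts"], record["line"]),
    )
    db.commit()
    ch.basic_ack(delivery_tag=method.delivery_tag)  # ack only after the write succeeds

channel.basic_consume(queue="machine_logs", on_message_callback=on_message)
channel.start_consuming()
```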
My second approach was to queue the logs in RabbitMQ but store them in MongoDB.
- Does it make sense to queue messages before writing them to MongoDB?
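One practical reason to keep the queue in front of MongoDB is that the consumer can drain it in batches and issue a single `insert_many` per batch instead of one write per log line. A rough sketch, again assuming pika and pymongo, with illustrative queue and collection names:

```python
# Sketch of the second approach: drain the queue and write to MongoDB in batches,
# which is one of the main reasons to keep a queue in front of the database.
# Names such as "machine_logs" and "logs_db" are illustrative.
import json

import pika
from pymongo import MongoClient

BATCH_SIZE = 100

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["logs_db"]["logs"]

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="machine_logs", durable=True)

buffer, tags = [], []

def flush(ch):
    # Insert the whole batch at once, then cumulatively ack everything in it.
    collection.insert_many(buffer)
    ch.basic_ack(delivery_tag=tags[-1], multiple=True)
    buffer.clear()
    tags.clear()

def on_message(ch, method, properties, body):
    buffer.append(json.loads(body))
    tags.append(method.delivery_tag)
    if len(buffer) >= BATCH_SIZE:
        flush(ch)

channel.basic_consume(queue="machine_logs", on_message_callback=on_message)
channel.start_consuming()
```

In practice a timer-based flush would also be needed so a partially filled batch does not sit unacknowledged in the queue.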
> Sinks at the producer system: logging to the file system in addition to, or prior to, sending logs to a centralized database is a good idea, just in case something goes wrong when sending log data across the network. Like the black box in the airline industry, if the file system survives a traumatic hardware failure or such, you still have some data to assist with post-mortems. – Humanize
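A sketch of that idea on the producer side, assuming pika and Python's standard logging module; the file path, queue name, and message fields are made up:

```python
# Producer-side "local sink": write every log line to a local rotating file first,
# then try to publish it; if the network or broker is down, the local copy survives.
import json
import logging
from logging.handlers import RotatingFileHandler

import pika
from pika.exceptions import AMQPError

local_log = logging.getLogger("machine")
local_log.setLevel(logging.INFO)
local_log.addHandler(RotatingFileHandler("/var/log/app/machine.log",
                                         maxBytes=10_000_000, backupCount=5))

connection = pika.BlockingConnection(pika.ConnectionParameters("rabbitmq-host"))
channel = connection.channel()
channel.queue_declare(queue="machine_logs", durable=True)

def emit(machine_id, ts, line):
    message = json.dumps({"machine_id": machine_id, "ts": ts, "line": line})
    local_log.info(message)  # local "black box" copy, written before the network send
    try:
        channel.basic_publish(
            exchange="",
            routing_key="machine_logs",
            body=message,
            properties=pika.BasicProperties(delivery_mode=2),  # persistent message
        )
    except AMQPError:
        # Network/broker failure: the line is still on disk for later replay.
        pass
```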