When you call perform_async, Sidekiq makes a Hash of the arguments, the worker class, and other information the job will need. It then serializes that Hash as JSON (collectively the "payload") and pushes it into Redis with lpush. See A Tour of the Sidekiq API, by the author of Sidekiq, for more.
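To make the flow concrete, here is a minimal sketch; the HardWorker class and its arguments are made up for illustration, and the payload shown in the comment is abbreviated:

```ruby
require "sidekiq"

# A hypothetical job class; any class that includes Sidekiq::Job
# (Sidekiq::Worker on older versions) is enqueued the same way.
class HardWorker
  include Sidekiq::Job

  def perform(user_id, note)
    # ... do the actual work here ...
  end
end

# Enqueue the job. Sidekiq builds a Hash roughly like
#   { "class" => "HardWorker", "args" => [123, "resize avatar"],
#     "queue" => "default", "jid" => "...", ... }
# serializes it to JSON, and LPUSHes that string onto the queue's list in Redis.
HardWorker.perform_async(123, "resize avatar")
```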
The only limits are Ruby's String size, Redis' string size, and your available Redis memory. A Ruby String tops out at 2 GB on a 32-bit build; on 64-bit you will run out of memory long before hitting the limit.
For Redis the limit is 512 MB, and note that is 512 MB after the content is serialized as JSON. If the payload is mostly text, serialization adds little; if it is binary data, for example compressed text, the JSON-encoded form can be significantly larger.
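A quick way to see the effect is to serialize some arguments yourself and measure the result; this snippet is only an illustration of the size difference, not anything Sidekiq requires:

```ruby
require "json"
require "base64"
require "securerandom"

# Roughly what Sidekiq will store for the "args" portion of the payload.
args = [123, "some text field", { "tags" => %w[a b c] }]
puts JSON.generate(args).bytesize        # a handful of bytes for plain-text args

# Binary data grows once it has to be made JSON-safe:
# Base64 encoding alone adds roughly 33% on top of the raw size.
binary  = SecureRandom.random_bytes(1_000_000)   # 1 MB of raw bytes
encoded = Base64.strict_encode64(binary)
puts JSON.generate([encoded]).bytesize           # ~1.3 MB once encoded and quoted
```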
What would be the recommended maximum size for a job payload?
As small as possible. Large payloads require expensive JSON serialization and deserialization, and they consume both Redis and worker memory, risking out-of-memory errors.
Instead of sending the content of a file, store the file somewhere the worker can access, such as a shared disk or an S3 bucket. Send only what the worker needs to retrieve the file.
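As a sketch of that pattern, assuming the file already lives in S3 and the aws-sdk-s3 gem is available; the ProcessFileJob class, bucket name, and key are hypothetical:

```ruby
require "sidekiq"
require "aws-sdk-s3"

class ProcessFileJob
  include Sidekiq::Job

  # The payload is just a bucket name and an object key, a few dozen bytes,
  # instead of the file contents themselves.
  def perform(bucket, key)
    s3       = Aws::S3::Client.new
    contents = s3.get_object(bucket: bucket, key: key).body.read
    # ... process the file contents ...
  end
end

# Upload the file (or make sure it already exists), then enqueue only a pointer to it.
ProcessFileJob.perform_async("my-uploads-bucket", "imports/2024/report.csv")
```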
See Best Practices in the Sidekiq wiki.