Cron still sounds fine to me in your case, but you might want to give Celery a try.
To me, Celery is a Python module for asynchronous, distributed task queues. It lets you dispatch lengthy tasks to multiple processes running on multiple machines (one process on one machine is fine too). When you need to do something that takes time (like generating thumbnails, talking to an external API, or generating complex reports), you can use Celery to do it in the background without blocking the HTTP request for your user.
Some advantages of Celery over crontab:
- you can run tasks asynchronously, as soon as at least one Celery worker is free
- it scales well to multiple processes / machines
- celerybeat is Celery's answer to crontab: you can schedule tasks at given datetimes or intervals using Python syntax in your settings.py
- you can apply rate limits per task (for example, for some sort of prioritization)
- there are monitoring tools like Flower which give you a decent idea of which tasks have failed and which have succeeded
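To illustrate the celerybeat point above, a schedule entry in settings.py can look like this (the task path is a hypothetical example; cron-style entries would use `celery.schedules.crontab` instead of `timedelta`):

```python
# Sketch of a celerybeat schedule in settings.py — run a (made-up)
# digest task every 30 minutes, expressed in plain Python.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    "send-digest-every-30-min": {
        "task": "myapp.tasks.send_digest",  # hypothetical task path
        "schedule": timedelta(minutes=30),
    },
}
```

celerybeat reads this dict and enqueues the task on schedule; a worker then executes it like any other task.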
Some disadvantages of Celery:
- setup can take some time: you have to set up a queue broker and daemonize the workers in production; cron will already be there
- each worker process is likely to use roughly the same amount of RAM as your Django process; this can cost you money, or you may simply not have enough RAM to run Celery on, say, the AWS free tier
Also, if it's just about sending e-mails, you could consider using a paid service such as Postmark (I am not affiliated with them), which will handle e-mail throttling for you.