Python/Django: sending emails in the background
Imagine a situation in which a user performs an action on a website and admins are notified by e-mail. Say there are 20 admins to notify. With Django's normal, synchronous email-sending methods, the user has to wait until all the emails have been sent before they can proceed.

How can I send all the emails in a separate process so the user doesn't have to wait? Is it possible?

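For illustration, this is roughly what the blocking approach looks like (a minimal sketch; the admin addresses, subject and sender are made up):

from django.core.mail import send_mail

def notify_admins(admin_emails):
    # Each send_mail call talks to the mail server synchronously, so the
    # HTTP request does not finish until all 20 messages have been sent.
    for address in admin_emails:
        send_mail(
            'A user performed the action',
            'Details of the action go here.',
            'noreply@example.com',
            [address],
        )
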
Wb answered 2/10, 2011 at 11:30 Comment(2)
An alternative (simple) solution could be to send an email to a Gmail (or other) address, which would then use a rule or filter to forward it on to all admins.Scene
Sending to a Gmail address is not really a good solution at all. It would cause a lot of other problems, such as keeping the Gmail address in sync with the admins as they change. Gmail can also be unpredictably slow, have high latencies, or even be down, which would cause a lot of unexpected errors and slowness for the users.Attainture

Use Celery as a task queue together with django-celery-email, which is a Django e-mail backend that dispatches e-mail sending to a Celery task.

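A minimal sketch of the configuration this implies (the backend path is the one django-celery-email documents; the Redis broker URL and the exact broker setting name are assumptions that depend on your Celery version and setup):

# settings.py
EMAIL_BACKEND = 'djcelery_email.backends.CeleryEmailBackend'

# Broker that Celery uses to queue the e-mail tasks (Redis assumed here;
# newer Celery setups may spell this CELERY_BROKER_URL instead)
BROKER_URL = 'redis://localhost:6379/0'

Application code keeps calling django.core.mail.send_mail() as before; the backend turns each send into a Celery task, so the web request returns immediately while a worker process does the actual sending.
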
Attainture answered 2/10, 2011 at 11:40 Comment(3)
Can I ask one thing? If I have understood correctly, I would not need to change the code of my app to implement celery, except for my settings.py. Correct?Wb
As long as you are using the django.core.mail API you will not have to change anything in your code; the alternative email backend takes care of the celery integration. You can, however, easily write other arbitrary celery tasks to be executed in the background, outside of the web process, which can be very handy.Attainture
I got OperationalError: [Errno 111] Connection refused in kombu.Cheeky

Another option is django-mailer. It queues outgoing mail in a database table, and you then use a cron job to send the queued messages.

https://github.com/pinax/django-mailer

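A minimal sketch of how this is usually wired up (the backend path and the send_mail/retry_deferred management commands come from django-mailer's documentation; the project path and cron schedule are just examples):

# settings.py
EMAIL_BACKEND = "mailer.backend.DbBackend"

# crontab: send queued mail every 5 minutes, retry deferred messages hourly
*/5 * * * * (cd /path/to/project && python manage.py send_mail >> cron_mail.log 2>&1)
0 * * * * (cd /path/to/project && python manage.py retry_deferred >> cron_mail_deferred.log 2>&1)

With the DbBackend in place, calls made through django.core.mail only insert a row into the queue table, so the user's request is never blocked by SMTP.
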
Cryptology answered 20/3, 2013 at 10:47 Comment(0)

If we are only talking about sending some 20 mails from time to time, a thread may be a workable solution. For expensive background tasks, use Celery.

This is a sample using a thread:

# This Python file uses the following encoding: utf-8

# threading
from threading import Thread

...

class afegeixThread(Thread):

    def __init__(self, usuari, parameter=None):
        Thread.__init__(self)
        self.usuari = usuari
        self.parameter = parameter
        ...

    def run(self):
        errors = []
        try:
            if self.parameter:
                ...  # do the slow work here, e.g. send the e-mails
        except Exception as e:
            errors.append(e)  # collect the error; an uncaught exception would only kill this thread
...

n = afegeixThread('p1')
n.start()
Matthus answered 2/10, 2011 at 13:03 Comment(2)
This is possible, but the work is still done in a web server process, which is not ideal for background tasks. If you have the ability you should set up a proper task queue and off-load the web processes as much as possible.Attainture
I'm not sure exactly what you are asking, but celery is built for handling async workloads outside of the web processes, and it is really easy and straightforward to get going, see the celery docs. It is usually good practice to split heavy jobs into smaller jobs if possible. Celery can then be run on multiple machines and consume tasks in parallel, making it very easy to scale and handle heavy jobs! This can for instance be applied to video uploads, image resizing, sending e-mails, generating PDFs or other similar heavy things!Attainture
