Using Docker for a mail server [closed]
I've been interested in docker for a while, but haven't jumped in yet. I now need to set up a mail server, so I thought I could use this as a reason to learn more about docker. However, I'm unclear on how best to go about it.

I've installed a mail server on a VPS before, but never split one across multiple containers. I'd like to install Postfix, Dovecot, MySQL or PostgreSQL, and SpamAssassin, similar to what is described here:

https://www.digitalocean.com/community/tutorials/how-to-configure-a-mail-server-using-postfix-dovecot-mysql-and-spamassasin

However, what would be a good way to dockerize it? Would I simply put everything into a single container? Or would it be better to have MySQL in one container, Postfix in another, and additional containers for Dovecot and SpamAssassin? Or should some containers be shared?

Are there any HOWTOs on installing a mail server using docker? If there are, I haven't found them yet.

Chafin answered 28/1, 2015 at 3:22
After much research and extensive effort rolling my own postfix server on a VPS inside docker, I discovered this repo, which is actively maintained. It's a postfix server that runs inside docker by default, providing easy config setup for your own domain to send and receive email. I have been using this in prod for a few years and it's solid; I highly recommend you check out github.com/tomav/docker-mailserver – Friedcake
The point of Docker isn't containerization for containerization's sake. It is to put together things that belong together and separate things that don't belong together.

With that in mind, the way I would set this up is with one container for the MySQL database and another container for all of the mail components. The mail components are typically integrated with each other by calling each other's executables or by reading and writing shared files, so it does not make sense to split them into separate containers anyway. Since the database could also be used for other things, and communication with it happens over a socket, it makes more sense for that to be a separate container.
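To make that concrete, here is a minimal Docker Compose sketch of the two-container layout (image tags, service names, credentials, and ports are illustrative assumptions, not part of the answer):

```yaml
# docker-compose.yml — one DB container, one container for the whole mail stack
# (names, tags, and credentials below are placeholders)
services:
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: changeme
      MYSQL_DATABASE: mailserver
    volumes:
      - db-data:/var/lib/mysql

  mail:
    build: ./mail        # an image that installs Postfix, Dovecot, and SpamAssassin together
    depends_on:
      - db
    ports:
      - "25:25"          # SMTP
      - "143:143"        # IMAP
      - "587:587"        # submission
    volumes:
      - mail-data:/var/mail

volumes:
  db-data:
  mail-data:
```

The `db` service name doubles as the hostname the mail container would use to reach MySQL over the Compose network socket.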

Airdrome answered 28/1, 2015 at 3:30
Actually, if you've set it up well, the communication between components is LMTP with perhaps a bit of milter. – Seigneury
@Moshe, I understand your point, but I've also got other advice. One person suggested that if the MTA internally calls other executables, it might be best to keep them in the same image. But if those executables run as their own command in their own process space, they should be in different images. Another person suggested running everything in different containers, so mysql, postfix, dovecot, spamassassin, etc. would all be separate containers. He suggested using --link in that case. This gives the benefit of swapping out each piece without affecting the others. – Chafin
@Chafin If you look around long enough, you'll eventually collect every possible answer to this kind of question. Go with whatever works best for you. – Airdrome
@Moshe agree with you, and thanks for your opinion. – Chafin
There sometimes are good reasons to separate things into multiple containers, but the OP didn't give enough context to assess that one way or the other. – Seigneury
Dovecot, SpamAssassin, et al. can go in containers separate from postfix. Use LMTP for the connections and it will all work. This much is practical.
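As a sketch of that hand-off, postfix can deliver mailbox mail to a Dovecot container over LMTP; the container hostname `dovecot` and port 24 below are assumptions for illustration:

```
# Postfix main.cf — hand local delivery to the Dovecot container over LMTP
# ("dovecot" is an assumed container hostname on the shared network)
mailbox_transport = lmtp:inet:dovecot:24

# Dovecot (e.g. conf.d/20-lmtp.conf) — listen for LMTP on the network
# instead of a Unix socket, so postfix can reach it from another container
service lmtp {
  inet_listener lmtp {
    port = 24
  }
}
```

With this, the only link between the two containers is a TCP connection, which is exactly the kind of boundary containers are good at.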

Now for the ideological bit: if you really wanted to do things 'the docker way', what would that look like?

Postfix is the difficult one. It's not one daemon, but rather a cluster of different daemons that talk to each other and handle different parts of the mail-processing tasks. Some of the interaction between these component daemons is via files (e.g. the mail queues), some is via sockets, and some is via signals.

When you start up postfix, you really start the 'master' daemon, which then starts the other daemon processes it needs using the rules in master.cf.
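For illustration, the first few service definitions in a stock master.cf show what the master daemon supervises (excerpt from postfix's default file):

```
# service  type  private unpriv  chroot  wakeup  maxproc  command
smtp       inet  n       -       n       -       -        smtpd
pickup     unix  n       -       n       60      1        pickup
cleanup    unix  n       -       n       -       0        cleanup
qmgr       unix  n       -       n       300     1        qmgr
```

Each line is a separate daemon that master starts on demand: smtpd receives mail, pickup and cleanup feed the incoming queue, and qmgr schedules delivery.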

Logging is particularly difficult in this scenario. All the different daemons independently log to /dev/log, and there's really no way to process those logs without putting a syslog daemon inside the container. "Not the docker way!"

Basically the compartmentalisation of functionality in postfix is very much a micro-service sort of approach, but it's not based on containerisation. There's no way for you to separate the different services out into different containers under docker, and even if you could, the reliance on signals is problematic.

I suppose it might be possible to re-engineer the 'master' daemon, giving it access to the docker daemon on the host (or running docker within docker), so that this new master daemon could coordinate the various services in separate containers. We can speculate, but I've not heard of anyone taking this on as an actual project.

That leaves us with the more likely option of choosing a more container-friendly daemon than postfix for use in docker. I've been using postfix more or less exclusively for about the past decade and haven't had much reason to look at alternatives until now. I'd be very interested if anyone could comment on more docker-friendly MTA options.

Seigneury answered 7/5, 2015 at 16:16
...and this might be why there is no "official" postfix docker image. I have been using github.com/catatnight/docker-postfix – Glennisglennon
I was using catatnight's image too, for simple things like outbound relays, but it hasn't been maintained in years. It doesn't need much changed, but it does need to be rebuilt on a more recent operating system that still receives security patches. – Seigneury
