You don't need to use virtualenv inside a Docker container.
virtualenv is used for dependency isolation: you want to prevent any dependencies or packages from leaking between applications. Docker achieves the same thing; it isolates your dependencies within your container and prevents leaks between containers and between applications.
Therefore, there is no point in using virtualenv inside a Docker container unless you are running multiple apps in the same container. If that's the case, I'd say you're doing something wrong, and the solution is to architect your app in a better way and split it up into multiple containers.
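For example, a minimal sketch of the usual single-app setup, installing straight into the image's Python with no virtualenv at all (the app.py and requirements.txt names are just placeholders):
FROM python:3.9-slim
WORKDIR /app/
# Install dependencies directly; the container itself provides the isolation
# a virtualenv would normally give you.
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]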
EDIT 2022: Given that this answer gets a lot of views, I thought it might make sense to add that, four years later, I've realized there actually are valid uses of virtual environments in Docker images, especially when doing multi-stage builds:
FROM python:3.9-slim as compiler
ENV PYTHONUNBUFFERED=1
WORKDIR /app/
RUN python -m venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY ./requirements.txt /app/requirements.txt
RUN pip install -Ur requirements.txt
FROM python:3.9-slim as runner
WORKDIR /app/
COPY --from=compiler /opt/venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY . /app/
CMD ["python", "app.py", ]
In the Dockerfile example above, we create a virtualenv at /opt/venv and "activate" it with an ENV statement that puts its bin directory first on PATH. We then install all dependencies into /opt/venv and simply copy that folder into the runner stage of the build. This can help minimize the Docker image size.
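If you want a rough way to see the effect on size, you can build the compiler stage and the final image separately and compare them (the image tags here are just placeholders):
docker build --target compiler -t myapp-compiler .
docker build -t myapp .
docker images | grep myapp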
source ../bin/activate tried? – Vmail
Build a venv in your Docker image, and then use the pip corresponding to the target virtualenv for installing packages into that virtualenv. If you call /path/to/venv/bin/pip (note the full venv path) you'll likely find success. – Revisionism
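A minimal sketch of what that comment describes, calling the venv's own pip by its full path instead of activating it (paths and file names are just placeholders):
FROM python:3.9-slim
WORKDIR /app/
RUN python -m venv /opt/venv
COPY requirements.txt .
# Using the full path to the venv's pip installs packages into /opt/venv
# without any activation or PATH changes.
RUN /opt/venv/bin/pip install -r requirements.txt
COPY . .
CMD ["/opt/venv/bin/python", "app.py"]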