I started looking at `pipenv`, and it seems to be pretty good. My only concern is that most of my projects involve `numpy`, `scipy`, and some other not-so-small libraries.
The way I currently manage my projects: I have `pyenv` and `pyenv-virtualenv` installed. I have a few (currently 4) specific virtualenvs that each cater to a type of project. The projects themselves have a `.python-version` set, and I have the autoload-virtualenv feature of `pyenv` enabled. If I need to share a project, I generate a `requirements.txt` with `pip freeze -l` from the virtualenv.
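To make the setup concrete, the workflow looks roughly like this (the Python version and env names are just examples):

```shell
# One-time: create a shared virtualenv for a whole class of projects
pyenv virtualenv 3.10.4 scientific-env

# Per project: pin the env so pyenv-virtualenv auto-activates it on cd
cd ~/projects/my-analysis
pyenv local scientific-env   # writes .python-version

# When sharing a project: export only locally installed packages
pip freeze -l > requirements.txt
```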
So in my current setup, I have X projects and Y virtualenvs (with Y << X), all amounting to a few GB of hard-disk space. Note that because of large libraries like `numpy`, each of the virtualenvs is pretty big, around 700-900 MB.
My question:

As far as I understand, `pipenv` will by default create a virtualenv for each of my projects, so the hard-disk space taken up by my virtualenvs would increase considerably. What I'm interested in is:
- Is it possible to share `pipenv` environments across several projects that use exactly the same dependencies? I.e., multiple `pipenv` configs that load the same virtualenv?
- If not, is it possible to generate `pipenv` config files from a virtualenv I set up with `pyenv`? I.e., I would not use `pipenv` to actually run my projects, and I would not create any virtualenvs with `pipenv`, but I would create `pipenv` config files for sharing the project (in this case, probably alongside a `requirements.txt` as well).
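For the second option, what I have in mind is roughly the following (an untested sketch; I believe `pipenv install -r` can import an existing requirements file, but I haven't verified what it does about creating a virtualenv as a side effect):

```shell
# Export the existing pyenv-managed virtualenv
pip freeze -l > requirements.txt

# Generate a Pipfile / Pipfile.lock from it for sharing; I would
# then ignore or remove whatever virtualenv pipenv creates here
pipenv install -r requirements.txt
```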
edit: I rewrote most of the question to make it clearer.