I'm using miniconda to manage my installation of data science packages. It's a workflow I've mostly settled into by now, so I'd like it to work in this case too. I would also expect it to work, since conda is supposed to help in exactly this kind of situation - where more than pure-Python dependencies are needed.
I would like to install the Python cdt toolbox. It is a pip-installable package available on PyPI but not in any conda channel. It requires PyTorch, which is easily available on PyPI and in its own conda channel. But it further requires some R packages, most available on CRAN and some only on GitHub. My dream scenario is to have a single environment.yml file looking something like this:
name: my_env
channels:
  - defaults
  - pytorch
dependencies:
  - pytorch
  - cpuonly
  - pip
  - pip:
    - cdt
    - -e .
  - r: # this line and below don't work...
    - pcalg  # available on CRAN, but not in the conda channel r
    - kpcalg # available on CRAN, but not in the conda channel r
    - github:https://github.com/Diviyan-Kalainathan/RCIT # R package not on CRAN nor in any conda channel
I assume there is no simple, direct way to do it like the above, based on this question on SO about similar npm installs. I cannot possibly be the first person who needs to install both R and Python packages in one environment... so what are the 'standard' workarounds?
pip is sort of exceptional, as per that other post. The canonical answer is to create conda packages for whatever is not already there. The pcalg package is already available through the bioconda channel (under the name r-pcalg). It should be straightforward to add kpcalg to Bioconda - all its dependencies are already hosted there or on Conda Forge. The RCIT package only has standard dependencies, so you could possibly build a Conda Forge recipe for it. – Lynx