I'm setting up a Galaxy server consisting of a Galaxy head node and a Docker
engine (swarm) running on top of an OpenNebula cloud. Ideally, I would like to run
everything using this Docker engine.
For tools that have only one dependency, the Docker resolver works perfectly. But for
tools with more than one dependency, if I activate mulled containers
(enable_beta_mulled_containers = True), Galaxy just tries to run the tool script in the
container corresponding to the first dependency found, which of course fails.
What is the correct approach for resolving several dependencies using Docker?
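For context, this is roughly the relevant part of my setup (a sketch of my galaxy.ini; Docker itself is enabled via the docker_enabled param on the job destination in job_conf.xml):

```
# galaxy.ini (relevant excerpt) -- sketch, not my full config
[app:main]
# beta option that makes Galaxy look up mulled (multi-package) containers
enable_beta_mulled_containers = True
```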
I tried activating involucro, which (as far as I understand) builds the Docker
container on the fly from the conda environment that results from merging all the
dependencies. I know it's a beta feature, but I still couldn't make it
work. Besides setting the path and auto_init to True, what am I supposed to put in the
configuration?
Besides this, would it be possible to combine the dependencies in another way? Is it
possible in the current version to load a conda environment inside a Docker container?
I know it may sound redundant and unnecessary in terms of dependency resolution, but it
would be convenient in cases like the one I described, where the Docker swarm is already running.
I tried this with a few tools, using one of the containers as a base and loading the rest of
the dependencies via a conda environment. Would it be possible to do this automatically,
using a generic base container?
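To illustrate, the manual workaround looked roughly like this (a sketch: the base image and the extra package are examples only, not a real tool's dependencies, and it assumes conda is available inside the BioContainers base image):

```
# Dockerfile sketch of the manual approach: one dependency's container
# as base, remaining dependencies installed via conda (example names only)
FROM quay.io/biocontainers/samtools:1.3.1--0
RUN conda install -y -c bioconda -c conda-forge bcftools=1.3
```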
Anyway, the general question is where things are going regarding dependency resolution, so
that I can get an idea of what to expect in the future and how to contribute to the
development of these features. I found a thread on GitHub about this ( [
] ) but I don't know what decisions were made about it (if any).
Thanks in advance for the help.
Ignacio Eguinoa - Predoctoral fellow
Applied Bioinformatics And Biostatistics
VIB-UGent Center for Plant Systems Biology
Technologiepark 927 - 9052 Ghent - Belgium
Tel. +32(0)9 331 36 95