Running with Multiple DRMs
Hi Galaxy-Dev,

I have a new local Galaxy instance that submits to cluster1 via pbs-drmaa-1.0.14. Galaxy runs on a VM that is a submit node on cluster1, and the datasets live on a filesystem shared among all the clusters. The VM is also a submit node on cluster2, and we have installed a second Torque client on the VM, in a different location, to interface with that cluster. We have confirmed that /usr/local/bin/qsub submits to cluster1 and /N/softs/galaxy/torque/qsub submits to cluster2. The two clusters run different versions of Torque, and our end goal is to be able to submit to Condor and SGE as well (as I understand it, setting up a distributed Galaxy is a hot topic at multiple institutions).

My question, then, is how to go about submitting certain jobs (say, all BLAST jobs) to cluster2 from Galaxy. Here are my ideas so far:

(1.) If I can switch the path from the default Torque client to our new Torque client while a tool is being submitted, that should do the trick; that way Galaxy and DRMAA would not need to know the details. I thought about doing this in runners.py, but I don't think just exporting the path would work.

(2.) If I can get DRMAA configured to submit with the correct qsub command given drmaa://-q queuename@host, that would work. So far, specifying just the queue and hostname has not worked (see the standalone test sketch after this list).

(3.) If tool handlers can be configured to use different settings for running jobs, this might be workable. I looked into: http://wiki.galaxyproject.org/Admin/Config/Jobs That looks very promising, but I don't see job_conf* anywhere in my new clone (see the job_conf.xml sketch after this list).

(4.) Change the tool wrapper to include some other parameter returned by runners.py, as suggested here: http://lists.bx.psu.edu/pipermail/galaxy-dev/2012-June/010080.html I'm not sure this would do what I want.

(5.) I may need to make changes to execute() and DefaultToolAction, as alluded to here: http://lists.bx.psu.edu/pipermail/galaxy-dev/2012-June/009964.html This would require more time and energy to understand and modify than I have available at the moment.

Are there any other insights into this sort of setup?

Thanks!
-Carrie Ganote
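[Editor's note on idea (2): Torque's qsub accepts -q queue@server to route a job to a named server, so if pbs-drmaa passes the native specification through to its qsub equivalent, that form may be enough to reach cluster2. Whether a given pbs-drmaa build honors queue@server is worth checking outside Galaxy first. Below is a minimal standalone sketch using drmaa-python; the library path, queue name, and hostname are placeholders for your site, not tested values.]

    # Standalone test of idea (2), outside Galaxy. Assumes drmaa-python is
    # installed and DRMAA_LIBRARY_PATH points at the pbs-drmaa libdrmaa.so
    # built against the cluster2 Torque client, e.g. (hypothetical path):
    #   export DRMAA_LIBRARY_PATH=/N/softs/galaxy/pbs-drmaa/lib/libdrmaa.so
    import drmaa

    s = drmaa.Session()
    s.initialize()
    jt = s.createJobTemplate()
    jt.remoteCommand = '/bin/hostname'  # trivial job: output shows where it ran
    # Torque's queue@server form; queue and host are placeholders
    jt.nativeSpecification = '-q queuename@cluster2.example.org'
    jobid = s.runJob(jt)
    print 'submitted %s' % jobid        # Python 2, matching Galaxy of that era
    s.deleteJobTemplate(jt)
    s.exit()

If the job lands on cluster2, the same native specification can be carried over into the Galaxy destination config sketched next.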
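[Editor's note on idea (3): job_conf.xml is a relatively recent addition to galaxy-dist, which would explain why an older clone has no job_conf* files; pulling the latest distribution should bring in a job_conf.xml.sample to copy from. The sketch below shows the general shape of a tool-to-destination mapping under the assumptions of this thread; the destination ids, queue names, hostname, and tool id are illustrative, and the cluster2 destination reuses the untested queue@server idea from above rather than a confirmed mechanism.]

    <?xml version="1.0"?>
    <!-- Hypothetical job_conf.xml sketch, not tested site config -->
    <job_conf>
        <plugins>
            <plugin id="drmaa" type="runner"
                    load="galaxy.jobs.runners.drmaa:DRMAAJobRunner"/>
        </plugins>
        <handlers>
            <handler id="main"/>
        </handlers>
        <destinations default="cluster1">
            <destination id="cluster1" runner="drmaa">
                <param id="nativeSpecification">-q batch</param>
            </destination>
            <destination id="cluster2" runner="drmaa">
                <!-- relies on Torque's queue@server routing; whether
                     pbs-drmaa passes this through is what idea (2) tests -->
                <param id="nativeSpecification">-q batch@cluster2.example.org</param>
            </destination>
        </destinations>
        <tools>
            <!-- route BLAST jobs to cluster2; tool id is an example -->
            <tool id="ncbi_blastn_wrapper" destination="cluster2"/>
        </tools>
    </job_conf>

One caveat: DRMAA_LIBRARY_PATH is read once per Galaxy process, so both destinations above go through the same pbs-drmaa library. If the two Torque versions are incompatible at the library level, an alternative worth investigating is Galaxy's CLI job runner, which shells out to qsub and so could be pointed at /N/softs/galaxy/torque/qsub directly.]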