Hello Nate,
I have that version installed and had been using it with older versions of Galaxy for a few years. Once I loaded this new version, it no longer worked with the old definition in the universe file:
default_cluster_job_runner = drmaa:///
Do I need a job_conf.xml that uses the drmaa runner?
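For reference, a minimal job_conf.xml using the DRMAA runner might look like the sketch below (the handler id and the nativeSpecification resource list are placeholders, not a tested config):

<?xml version="1.0"?>
<job_conf>
    <plugins>
        <plugin id="drmaa" type="runner" load="galaxy.jobs.runners.drmaa:DRMAAJobRunner"/>
    </plugins>
    <handlers>
        <handler id="main"/>
    </handlers>
    <destinations default="drmaa_default">
        <destination id="drmaa_default" runner="drmaa">
            <!-- passed through to the scheduler; adjust for your queue -->
            <param id="nativeSpecification">-l walltime=72:00:00,nodes=1:ppn=4</param>
        </destination>
    </destinations>
</job_conf>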
On 3/5/14, 3:21 PM, Nate Coraor wrote:
Hi Pete,
The latest error is pretty strange and not one I've encountered before. It suggests that scramble is not loading setuptools in place of distutils and thus does not have access to the setuptools extensions (notably, egg-related functionality). Something abnormal still seems to be going on with your python environment.
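A quick diagnostic sketch (not from the original thread) to see which python and which setuptools the Galaxy user is actually picking up:

% which python
% python -c "import setuptools; print setuptools.__version__, setuptools.__file__"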
You can use drmaa if you like (this is known to work well). You will want to use the libdrmaa for Torque that's maintained by the Poznan Supercomputing and Networking Center, rather than the libdrmaa that can be built directly with the Torque source. PSNC libdrmaa for Torque/PBS can be found here: http://apps.man.poznan.pl/trac/pbs-drmaa
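A rough sketch of the usual build-and-point steps (the install prefix and paths below are illustrative only; check the PSNC documentation and ./configure --help for your site):

% cd pbs-drmaa-x.y.z                  # unpacked PSNC pbs-drmaa source
% ./configure --prefix=/opt/pbs-drmaa
% make && make install
% export DRMAA_LIBRARY_PATH=/opt/pbs-drmaa/lib/libdrmaa.so   # tells Galaxy's drmaa egg which library to load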
--nate
On Wed, Mar 5, 2014 at 3:10 PM, Pete Schmitt <Peter.R.Schmitt@dartmouth.edu> wrote:
Are there any other alternatives to pbs_python for interfacing with a Torque scheduler? This method appears to be a dead end.
On 3/4/14, 9:51 AM, Nate Coraor wrote:
Pete,
Is it possible that `python` as the Galaxy user is calling a python other than /opt/python/2.7.6/bin/python (e.g. the system version without the -dev/-devel package installed)? The safest bet for ensuring you're using the right python, and that it won't have conflicting modules, is a Python virtualenv. This is easy to set up; just make sure you create it with the correct python executable, for example:
% wget https://pypi.python.org/packages/source/v/virtualenv/virtualenv-1.11.4.tar.gz
% tar zxf virtualenv-1.11.4.tar.gz
% /opt/python/2.7.6/bin/python ./virtualenv-1.11.4/virtualenv.py galaxyvenv
% . ./galaxyvenv/bin/activate
% cd galaxy-dist
% python ./scripts/fetch_eggs.py
% LIBTORQUE_DIR=/opt/torque/active/lib python scripts/scramble.py -e pbs_python
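Once the egg builds, Galaxy should be started with the same interpreter so it can find it, e.g. (a sketch assuming the virtualenv created above):

% . ./galaxyvenv/bin/activate
% sh run.sh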
--nate
On Mon, Mar 3, 2014 at 5:19 PM, Pete Schmitt <Peter.R.Schmitt@dartmouth.edu> wrote:
I uninstalled pbs_python 4.4.0 and reinstalled 4.3.5 as root (not using scramble).
When I try this method as the galaxy user:
LIBTORQUE_DIR=/opt/torque/active/lib python scripts/scramble.py -e pbs_python
I get the following output:
src/pbs_wrap.c:2813: warning: function declaration isn't a prototype
gcc -pthread -shared build/temp.linux-x86_64-2.7/src/pbs_wrap.o -L/opt/torque/4.2.7/lib -L. -ltorque -lpython2.7 -o build/lib.linux-x86_64-2.7/_pbs.so -L/opt/torque/4.2.7/lib -ltorque -Wl,-rpath -Wl,/opt/torque/4.2.7/lib
/usr/bin/ld: cannot find -lpython2.7
LD_LIBRARY_PATH="/opt/python/2.7.6/lib:/opt/torque/active/lib:/usr/local/lib"
I'm not sure why it can't find the libpython2.7.so file. When I built it as root, there was a -L/opt/python/2.7.6/lib in that gcc line.
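Note: LD_LIBRARY_PATH is only consulted at run time; ld needs the directory at link time via -L or gcc's LIBRARY_PATH. A possible, untested workaround (path taken from the build output above) would be:

% export LIBRARY_PATH=/opt/python/2.7.6/lib
% LIBTORQUE_DIR=/opt/torque/active/lib python scripts/scramble.py -e pbs_python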
On 3/3/14, 4:13 PM, Nate Coraor wrote:
Hi Pete,

Your subject says you are unable to build pbs_python using scramble. Could you provide details on what's not working there? Galaxy is not going to work with a different version of pbs_python unless a bit of hacking is done to make it attempt to do so. We test Galaxy with specific versions of its dependencies, which is why we control the versions of those dependencies and provide the scramble script to (hopefully) make it painless to build them yourself, should it be necessary to do so, as is always the case with pbs_python.

--nate

On Mon, Mar 3, 2014 at 3:57 PM, Pete Schmitt <Peter.R.Schmitt@dartmouth.edu> wrote:

Following the directions from here: https://wiki.galaxyproject.org/Admin/Config/Performance/Cluster#PBS
I'm trying to get pbs_python to work as I'm using Torque for scheduling Galaxy jobs.
Note: This is a fresh install of Galaxy from galaxy-dist on CentOS 5.10. I have the pbs_python 4.4.0 module installed into a source-built version of python/2.7.6.

I get the following error in the output of run.sh:

galaxy.jobs INFO 2014-03-03 15:46:45,485 Handler 'main' will load all configured runner plugins
Traceback (most recent call last):
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/webapps/galaxy/buildapp.py", line 39, in app_factory
    app = UniverseApplication( global_conf = global_conf, **kwargs )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/app.py", line 130, in __init__
    self.job_manager = manager.JobManager( self )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/jobs/manager.py", line 31, in __init__
    self.job_handler = handler.JobHandler( app )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/jobs/handler.py", line 30, in __init__
    self.dispatcher = DefaultJobDispatcher( app )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/jobs/handler.py", line 568, in __init__
    self.job_runners = self.app.job_config.get_job_runner_plugins( self.app.config.server_name )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/jobs/__init__.py", line 449, in get_job_runner_plugins
    module = __import__( module_name )
  File "/nextgen3/galaxy/galaxy-dist-py27/lib/galaxy/jobs/runners/pbs.py", line 31, in <module>
    raise Exception( egg_message % str( e ) )
Exception: The 'pbs' runner depends on 'pbs_python' which is not installed or not configured properly. Galaxy's "scramble" system should make this installation simple, please follow the instructions found at:
  http://wiki.galaxyproject.org/Admin/Config/Performance/Cluster
Additional errors may follow:
pbs-python==4.3.5

This is the job_conf.xml file:

<?xml version="1.0"?>
<job_conf>
    <plugins>
        <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
    </plugins>
    <handlers>
        <handler id="dirigo"/>
    </handlers>
    <destinations default="pbs_default">
        <destination id="pbs_default" runner="pbs"/>
        <param id="Resource_List">walltime=72:00:00,nodes=1:ppn=4</param>
    </destinations>
</job_conf>

I did not use the scramble system to install the pbs_python module. I downloaded the latest version available and installed it from the root account.

--
Pete Schmitt
--
Pete Schmitt
Technical Director: Discovery Cluster, NH INBRE Grid
Computational Genetics Lab
Institute for Quantitative Biomedical Sciences
Dartmouth College, HB 6203
L12 Berry/Baker Library
Hanover, NH 03755
Phone: 603-646-8109
http://discovery.dartmouth.edu
http://columbia.dartmouth.edu/grid
http://www.epistasis.org
http://iQBS.org