Hi all,

As a Galaxy newbie, I'm struggling to get the pbs-python module working on our Galaxy instance. PBS is installed to a custom location on a shared filesystem mounted via NFS to our Galaxy server; however, installing pbs-python using the git clone / Python venv method in the documentation fails because it can't find PBS, and there does not appear to be any way to tell pbs-python where it is (I could be wrong?).

...
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DTORQUE_4 -I/usr/include/torque -Isrc/C++ -I/usr/include/python2.7 -c src/C++/pbs_wrap.cxx -o build/temp.linux-x86_64-2.7/src/C++/pbs_wrap.o
In file included from src/C++/pbs_wrap.cxx:2978:0:
src/C++/pbs_ifl.h:90:32: fatal error: u_hash_map_structs.h: No such file or directory
 #include "u_hash_map_structs.h"
                                ^
compilation terminated.
error: command 'gcc' failed with exit status 1

So I went ahead and installed pbs_python from source (which does allow you to define a PBS_PYTHON_INCLUDEDIR environment variable); however, Galaxy does not seem to like this, as evidenced by errors during startup. I suspect this has to do with pbs_python not being installed into the Galaxy virtual environment.
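In case it helps anyone hitting the same wall, this is roughly how a custom PBS prefix can be exposed to the pbs_python source build from inside the Galaxy venv. All paths below are hypothetical placeholders, and the actual build/install commands are commented out because they require the PBS headers to be present; PBS_PYTHON_INCLUDEDIR is the variable the source installer honours, as noted above.

```shell
# Hypothetical prefix for an NFS-mounted PBS install; adjust to your site.
PBS_PREFIX=/shared/pbs

# pbs_python's source installer reads this variable to locate the PBS headers.
export PBS_PYTHON_INCLUDEDIR="$PBS_PREFIX/include"

# Make the PBS client libraries visible at link time and at runtime.
export LIBRARY_PATH="$PBS_PREFIX/lib"
export LD_LIBRARY_PATH="$PBS_PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# With the Galaxy virtualenv activated first, the module lands inside the venv
# (commented out here because it needs a real PBS install to compile against):
#   . /hpc/software/installed/galaxy/19.05/.venv/bin/activate
#   cd pbs_python && python setup.py install

echo "PBS_PYTHON_INCLUDEDIR=$PBS_PYTHON_INCLUDEDIR"
```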
galaxy[97486]: Traceback (most recent call last):
galaxy[97486]:   File "<string>", line 1, in <module>
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 179, in optional
galaxy[97486]:     conditional = ConditionalDependencies(config_file)
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 32, in __init__
galaxy[97486]:     self.parse_configs()
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 41, in parse_configs
galaxy[97486]:     for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'):
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1182, in parse
galaxy[97486]:     tree.parse(source, parser)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 656, in parse
galaxy[97486]:     parser.feed(data)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1642, in feed
galaxy[97486]:     self._raiseerror(v)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1506, in _raiseerror
galaxy[97486]:     raise err
galaxy[97486]: xml.etree.ElementTree.ParseError: junk after document element: line 4, column 0

I've attempted reinstalls via the git clone / Python venv method with the PBS environment variable set, to no avail. Does anyone have ideas for working around this roadblock, or has anyone encountered a similar module installation issue in the past?

Also, I was hoping to get some ideas/examples of generic job_conf.xml definitions for PBS clusters: best practices, caveats, etc. My understanding is that, unless configured otherwise, Galaxy will submit jobs as the galaxy user, and that configuring the server to run jobs as the end users themselves is difficult/risky. Do you have any opinions/thoughts/recommendations about this?
And finally, what would be a good way to test that Galaxy is submitting jobs to the queue properly? Is there some generic test data or procedure to verify that the Galaxy instance is working as expected?

Thanks,

Sandra Maksimovic
Systems Administrator, Information Technology
Murdoch Children's Research Institute
The Royal Children's Hospital, 50 Flemington Road
Parkville, Victoria 3052 Australia
T +61 3 8341 6498
E sandra.maksimovic@mcri.edu.au
W mcri.edu.au

Disclaimer: This e-mail and any attachments to it (the "Communication") are, unless otherwise stated, confidential, may contain copyright material and are for the use only of the intended recipient. If you receive the Communication in error, please notify the sender immediately by return e-mail, delete the Communication and the return e-mail, and do not read, copy, retransmit or otherwise deal with it. Any views expressed in the Communication are those of the individual sender only, unless expressly stated to be those of Murdoch Children's Research Institute (MCRI) ABN 21 006 566 972 or any of its related entities. MCRI does not accept liability in connection with the integrity of or errors in the Communication, computer virus, data corruption, interference or delay arising from or in respect of the Communication.
Hi Sandra,

We're running Galaxy with PBS Pro as the job scheduler, and I've never had any problem with pbs-python. In our case, though, the PBS client is installed locally (on the VM that hosts Galaxy).

Are you sure that PBS is correctly installed? Are you able to launch a simple PBS script as the galaxy user?

The error you see (xml.etree.ElementTree.ParseError) may be because one of your XML config files has an issue.

Fred
___________________________________________________________
Please keep all replies on the list by using "reply all" in your mail client.
To manage your subscriptions to this and other Galaxy lists, please use the interface at: %(web_page_url)s
To search Galaxy mailing lists use the unified search at: http://galaxyproject.org/search/
Hi Fred,

For us, PBS (free) is installed to a shared filesystem that is mounted via NFS to the Galaxy server (and via GPFS to the rest of the HPC). The server home directory is also custom (/opt) but local to the disk on each HPC node (including the Galaxy node). I have tested running a simple PBS script from the Galaxy node as the galaxy user, and it ran successfully.

In your case, have you installed PBS Pro with all default installation paths? It seems to me that the pbs-python module is happy to be installed against a default PBS layout but has trouble with a custom PBS install. I am having to dig through the pbs-python installer script to hunt down the relevant variables.

Re: parse error: I had configured a basic job_conf.xml to test with Galaxy, but it seems it was too basic. Are you perchance able to provide an example of your job_conf.xml for reference?

Thanks,
Sandra
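For anyone following along, the "simple PBS script" smoke test can be as small as the sketch below; the job name and resource requests are placeholder values, not recommendations (and the -l line uses TORQUE-style syntax, whereas PBS Pro would use select=1:ncpus=1). Run it as the galaxy user and check that the job appears in qstat and that the output file is written somewhere readable.

```shell
# Write a trivial PBS job script; the resource requests are placeholders.
cat > test.pbs <<'EOF'
#!/bin/bash
#PBS -N galaxy_smoke_test
#PBS -l nodes=1:ppn=1,walltime=00:05:00
#PBS -j oe
echo "hello from $(hostname) as $(whoami)"
EOF

# Submit and watch it (requires a working PBS client, hence commented out):
#   qsub test.pbs
#   qstat -u galaxy

# Sanity-check the script we just wrote.
grep -c '^#PBS' test.pbs    # prints 3
```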
Looking more closely at the parse error, it appears that Galaxy cannot load the PBS plugin (pbs_python installed from source):

    for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'):

As expected, pbs_python can happily detect my PBS installation when I run its PBSQuery.py script. The source and Python installers are noticeably different upon inspection; unfortunately, I still haven't managed to get the Python installation working. Here is my job_conf.xml:

    <plugins>
        <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
    </plugins>
    <destinations default="pbs_default">
        <destination id="pbs_default" runner="pbs"/>
        <destination id="other_cluster" runner="pbs">
            <param id="destination">@other.cluster</param>
        </destination>
    </destinations>

Thanks,
Sandra
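One thing worth checking: "junk after document element" is the error ElementTree raises when an XML file contains more than one top-level element, which is exactly what happens if the <plugins> and <destinations> sections are written side by side without a single enclosing root. A minimal complete job_conf.xml with a <job_conf> root (a sketch along the lines of the stock sample, not site-specific advice) would be:

```xml
<?xml version="1.0"?>
<!-- A single root element; two top-level elements in one file
     is what produces "junk after document element". -->
<job_conf>
    <plugins>
        <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
    </plugins>
    <destinations default="pbs_default">
        <destination id="pbs_default" runner="pbs"/>
        <destination id="other_cluster" runner="pbs">
            <param id="destination">@other.cluster</param>
        </destination>
    </destinations>
</job_conf>
```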
Sandra,

Have you installed DRMAA? See https://docs.galaxyproject.org/en/latest/admin/cluster.html#drmaa

For us it was: http://downloads.sourceforge.net/project/pbspro-drmaa/pbs-drmaa/1.0/pbs-drma...

Fred
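If you go the DRMAA route, the drmaa Python bindings find the C library through the DRMAA_LIBRARY_PATH environment variable, as described on the cluster page linked above. A sketch, with a hypothetical install location for the pbs-drmaa library:

```shell
# Hypothetical location of libdrmaa.so from a pbs-drmaa build; adjust to your site.
export DRMAA_LIBRARY_PATH=/shared/pbs-drmaa/lib/libdrmaa.so

# The corresponding job_conf.xml plugin line for Galaxy's DRMAA runner:
#   <plugin id="drmaa" type="runner" load="galaxy.jobs.runners.drmaa:DRMAARunner"/>

echo "DRMAA_LIBRARY_PATH=$DRMAA_LIBRARY_PATH"
```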
It seems to me that the pbs-python module is happy to be installed with PBS defaults but has trouble with a custom PBS install. I am having to look into the pbs-python installer script to hunt down the relevant variables. Re: parse error – I had configured a basic job_conf.xml to test with galaxy, but it seems it was too basic. Are you perchance able to provide an example of your job_conf.xml for reference? Thanks, Sandra From: SAPET, Frederic via galaxy-dev <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org>> Sent: Wednesday, 28 August 2019 5:09 PM To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au>>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org>> Subject: [galaxy-dev] Re: pbs-python issues Hi Sandra We're running a Galaxy with PBS Pro as jobs scheduler. I never had any problem with pbs-python however. Here, for us, the PBS client is installed locally (on the VM that hosts Galaxy). Are you sure that PBS is well installed ? Are you able to launch a simple PBS script with the galaxy user ? Maybe the error you see (xml.etree.ElementTree.ParseError ) is because one XML config file has an issue. Fred -----Message d'origine----- De : Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au>>> Envoyé : mercredi 28 août 2019 02:36 À : 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org>>> Objet : [galaxy-dev] pbs-python issues Hi all, As a galaxy newbie, I'm struggling to get the pbs-python module working on our galaxy instance. 
And finally, what would be a good way to test that galaxy is submitting jobs to the queue properly? Is there some generic test data/procedure to verify that the galaxy instance is working as expected?

Thanks,

Sandra Maksimovic
Systems Administrator, Information Technology
Murdoch Children's Research Institute
The Royal Children's Hospital, 50 Flemington Road
Parkville, Victoria 3052 Australia
T +61 3 8341 6498
E sandra.maksimovic@mcri.edu.au
W mcri.edu.au

Disclaimer: This e-mail and any attachments to it (the "Communication") are, unless otherwise stated, confidential, may contain copyright material and is for the use only of the intended recipient. If you receive the Communication in error, please notify the sender immediately by return e-mail, delete the Communication and the return e-mail, and do not read, copy, retransmit or otherwise deal with it. Any views expressed in the Communication are those of the individual sender only, unless expressly stated to be those of Murdoch Children’s Research Institute (MCRI) ABN 21 006 566 972 or any of its related entities. MCRI does not accept liability in connection with the integrity of or errors in the Communication, computer virus, data corruption, interference or delay arising from or in respect of the Communication.

___________________________________________________________
Please keep all replies on the list by using "reply all" in your mail client.
To manage your subscriptions to this and other Galaxy lists, please use the interface at: %(web_page_url)s
To search Galaxy mailing lists use the unified search at: http://galaxyproject.org/search/
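On the question just above of verifying that jobs actually reach the queue: one low-tech check, in line with Fred's suggestion of launching a simple PBS script as the galaxy user, is to submit a trivial job outside Galaxy first and watch it with qstat, then run a simple tool (e.g. Upload or a text-manipulation tool) inside Galaxy and confirm a job appears under the same user. A sketch (queue defaults, resource values and paths are placeholders, not from this thread):

```shell
#!/bin/sh
# Write a trivial PBS job script to a temp location. The resource
# request and job name are illustrative placeholders.
cat > /tmp/galaxy_smoke.pbs <<'EOF'
#!/bin/sh
#PBS -N galaxy_smoke
#PBS -l nodes=1:ppn=1,walltime=00:01:00
echo "smoke test ran on $(hostname)"
EOF

# Then, on a node with the PBS client tools on PATH, one would run:
#   su - galaxy -c 'qsub /tmp/galaxy_smoke.pbs'
#   qstat -u galaxy    # the job should appear, run, and complete
# and check the galaxy_smoke.o* output file for the echo line.
cat /tmp/galaxy_smoke.pbs
```

If that works but Galaxy-submitted jobs do not, the problem is on the Galaxy/runner side rather than the PBS installation.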
Hi Fred,

Thanks for your job_conf example. Could I confirm whether it’s the case that each tool needs to be individually configured per the specific instance's requirements? I’m having a problem with certain tools not being found, or with tools running successfully but their output not being displayed correctly. I’m mainly interested in whether this might indicate an installation issue or whether there is more per-tool configuration that needs to be done.

PS. I ended up getting pbs_python from source working, so I’m trying out the PBSJobRunner at the moment, although I have noticed that specifying a qsub flag as a param option (i.e. pretty much anything other than “Resource_List”) doesn’t actually work – I can’t get the job to submit to a particular queue. It also doesn’t seem to have an equivalent of the “nativeSpecification” param that pbs-drmaa does. Have you had any experience with pbs-python vs. pbs-drmaa? I’d be open to switching if the latter turns out to be more reliable/flexible.

Cheers,
Sandra

From: SAPET, Frederic <Frederic.SAPET@biogemma.com>
Sent: Friday, 30 August 2019 5:52 PM
To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org>
Subject: RE: pbs-python issues

Sandra,

Have you installed DRMAA?
https://docs.galaxyproject.org/en/latest/admin/cluster.html#drmaa

For us it was: http://downloads.sourceforge.net/project/pbspro-drmaa/pbs-drmaa/1.0/pbs-drmaa-1.0.17.tar.gz

Fred

From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>
Sent: Thursday, 29 August 2019 03:49
To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org>
Subject: [galaxy-dev] Re: pbs-python issues

Looking more closely at the parse error, it appears that galaxy cannot find the PBS plugin (pbs_python installed from source):

for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'):

As expected, pbs_python can happily detect my PBS installation when I run its PBSQuery.py script. The source and python installers are noticeably different upon inspection; unfortunately I still haven’t managed to get the python installation working. Here is my job_conf.xml:

<plugins>
    <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
</plugins>
<destinations default="pbs_default">
    <destination id="pbs_default" runner="pbs"/>
    <destination id="other_cluster" runner="pbs">
        <param id="destination">@other.cluster</param>
    </destination>
</destinations>

Thanks,
Sandra

From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>
Sent: Thursday, 29 August 2019 10:08 AM
To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org>
Subject: [galaxy-dev] Re: pbs-python issues

Hi Fred,

For us, PBS (free) is installed to a shared filesystem that is mounted via NFS to the galaxy server (and via GPFS to the rest of the HPC).
The server home directory is also custom (/opt) but local to disk on each HPC node (including galaxy). I have tested running a simple PBS script from the galaxy node as the galaxy user, and it ran successfully. In your case, have you installed PBS Pro with all default installation paths?
Hi,

Yes, each tool needs to be individually configured, but you can set a default destination:

<destinations default="Universe">

so all tools will use that one. That's why we set another destination for the upload tool (local) and yet another one for BlastN (UniverseBlast). All tools use the default "Universe" except upload and blastn:

<tools>
    <tool id="upload1" destination="local"/>
    <tool id="ncbi_blastn_wrapper" destination="UniverseBlast"/>
</tools>

We work here with pbs-drmaa; pbs-python is not installed in the galaxy virtualenv. We have DRMAA_PATH and DRMAA_LIBRARY_PATH set as environment variables.

Fred
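For reference, fragments like the above belong inside a single root element. This matters for the earlier traceback in this thread: ElementTree's "junk after document element" error is exactly what you get when a job_conf.xml contains more than one top-level element (e.g. <plugins> followed by <destinations> with no wrapper). A minimal well-formed sketch, combining the ids mentioned in this thread (the Resource_List value is an invented placeholder):

```xml
<?xml version="1.0"?>
<job_conf>
    <plugins>
        <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
    </plugins>
    <destinations default="Universe">
        <!-- Default PBS destination used by all tools unless overridden -->
        <destination id="Universe" runner="pbs"/>
        <!-- Cheap jobs (e.g. uploads) stay on the Galaxy host -->
        <destination id="local" runner="local"/>
        <!-- Heavier destination with an explicit (placeholder) resource request -->
        <destination id="UniverseBlast" runner="pbs">
            <param id="Resource_List">nodes=1:ppn=8</param>
        </destination>
    </destinations>
    <tools>
        <tool id="upload1" destination="local"/>
        <tool id="ncbi_blastn_wrapper" destination="UniverseBlast"/>
    </tools>
</job_conf>
```

Note the single <job_conf> root: without it, Galaxy's dependency scanner fails while parsing the file, before any runner is even loaded.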
Cheers, Sandra From: SAPET, Frederic <Frederic.SAPET@biogemma.com> Sent: Friday, 30 August 2019 5:52 PM To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: RE: pbs-python issues Sandra Do you have installed DRMAA ? https://docs.galaxyproject.org/en/latest/admin/cluster.html#drmaa<https://docs.galaxyproject.org/en/latest/admin/cluster.html#drmaa> for us is was : http://downloads.sourceforge.net/project/pbspro-drmaa/pbs-drmaa/1.0/pbs-drmaa-1.0.17.tar.gz<http://downloads.sourceforge.net/project/pbspro-drmaa/pbs-drmaa/1.0/pbs-drmaa-1.0.17.tar.gz> Fred -----Message d'origine----- De : Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au>> Envoyé : jeudi 29 août 2019 03:49 À : 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org>> Objet : [galaxy-dev] Re: pbs-python issues Looking more closely at the parse error, it appears that galaxy cannot find the PBS plugin (pbs_python installed from source): for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'): As expected, pbs_python can happily detect my PBS installation when I run its PBSQuery.py script. The source and python installers are noticeably different upon inspection, unfortunately I still haven’t managed to get the python installation working yet. 
Here is my job_conf.xml: <plugins> <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/> </plugins> <destinations default="pbs_default"> <destination id="pbs_default" runner="pbs"/> <destination id="other_cluster" runner="pbs"> <param id="destination">@other.cluster</param> </destination> </destinations> Thanks, Sandra From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au>> Sent: Thursday, 29 August 2019 10:08 AM To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org>> Subject: [galaxy-dev] Re: pbs-python issues Hi Fred, For us, PBS (free) is installed to a shared filesystem that is mounted via NFS to the galaxy server (and via GPFS to the rest of the HPC). The server home directory is also custom (/opt) but local to the disk on each HPC node (including galaxy). I have tested running a simple PBS script from the galaxy node as the galaxy user and it was able to run successfully. In your case, have you installed PBS Pro with all default installations paths? It seems to me that the pbs-python module is happy to be installed with PBS defaults but has trouble with a custom PBS install. I am having to look into the pbs-python installer script to hunt down the relevant variables. Re: parse error – I had configured a basic job_conf.xml to test with galaxy, but it seems it was too basic. Are you perchance able to provide an example of your job_conf.xml for reference? 
Thanks, Sandra From: SAPET, Frederic via galaxy-dev <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org>>> Sent: Wednesday, 28 August 2019 5:09 PM To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au>>>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org>>> Subject: [galaxy-dev] Re: pbs-python issues Hi Sandra We're running a Galaxy with PBS Pro as jobs scheduler. I never had any problem with pbs-python however. Here, for us, the PBS client is installed locally (on the VM that hosts Galaxy). Are you sure that PBS is well installed ? Are you able to launch a simple PBS script with the galaxy user ? Maybe the error you see (xml.etree.ElementTree.ParseError ) is because one XML config file has an issue. 
Fred -----Message d'origine----- De : Sandra Maksimovic <sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au>>>> Envoyé : mercredi 28 août 2019 02:36 À : 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org<mailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org%3cmailto:galaxy-dev@lists.galaxyproject.org>>>> Objet : [galaxy-dev] pbs-python issues Hi all, As a galaxy newbie, I'm struggling to get the pbs-python module working on our galaxy instance. PBS is installed to a custom location on a shared filesystem mounted via NFS to our galaxy server, however, installing pbs-python using the git clone / python venv method in the documentation fails because it can't find PBS and there does not appear to be any way to define it for pbs-python (I could be wrong?). ... gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DTORQUE_4 -I/usr/include/torque -Isrc/C++ -I/usr/include/python2.7 -c src/C++/pbs_wrap.cxx -o build/temp.linux-x86_64-2.7/src/C++/pbs_wrap.o In file included from src/C++/pbs_wrap.cxx:2978:0: src/C++/pbs_ifl.h:90:32: fatal error: u_hash_map_structs.h: No such file or directory #include "u_hash_map_structs.h" ^ compilation terminated. 
error: command 'gcc' failed with exit status 1 So I went ahead and installed pbs_python from source (which does allow you define a PBS_PYTHON_INCLUDEDIR environment variable), however, galaxy does not seem to like this as evidenced by errors during startup. I suspect this has to do with pbs_python not being installed into the galaxy virtual environment. galaxy[97486]: Traceback (most recent call last): galaxy[97486]: File "<string>", line 1, in <module> galaxy[97486]: File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 179, in optional galaxy[97486]: conditional = ConditionalDependencies(config_file) galaxy[97486]: File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 32, in __init__ galaxy[97486]: self.parse_configs() galaxy[97486]: File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 41, in parse_configs galaxy[97486]: for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'): galaxy[97486]: File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1182, in parse galaxy[97486]: tree.parse(source, parser) galaxy[97486]: File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 656, in parse galaxy[97486]: parser.feed(data) galaxy[97486]: File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1642, in feed galaxy[97486]: self._raiseerror(v) galaxy[97486]: File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1506, in _raiseerror galaxy[97486]: raise err galaxy[97486]: xml.etree.ElementTree.ParseError: junk after document element: line 4, column 0 I've attempted reinstalls of the git clone / python venv method with the PBS environment variable to no avail. I was wondering if someone might have any ideas about working around this roadblock, or may have encountered a similar module installation issue in the past? Also, I was hoping to get some ideas/examples of generic job_conf.xml definitions for PBS clusters? 
Things like best practices, caveats, etc. My understanding is that, unless configured otherwise, galaxy will submit jobs as the galaxy user and that configuring the server to run jobs as end users themselves is difficult/risky. Just wondering if you guys might have opinions/thoughts/recommendations about this? And finally, what would be a good way to test that galaxy is submitting jobs to the queue properly? Is there some generic test data/procedure to verify that the galaxy instance is working as expected? Thanks, Sandra Maksimovic Systems Administrator Information Technology Murdoch Children's Research Institute The Royal Children's Hospital, 50 Flemington Road Parkville, Victoria 3052 Australia T +61 3 8341 6498 E sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au<mailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au%3cmailto:sandra.maksimovic@mcri.edu.au>>>> W mcri.edu.au<https://www.mcri.edu.au/<https://www.mcri.edu.au/><https://www.mcri.edu.au/<https://www.mcri.edu.au/>><https://www.mcri.edu.au/<https://www.mcri.edu.au/><https://www.mcri.edu.au/<https://www.mcri.edu.au/>>>> Disclaimer This e-mail and any attachments to it (the "Communication") are, unless otherwise stated, confidential, may contain copyright material and is for the use only of the intended recipient. If you receive the Communication in error, please notify the sender immediately by return e-mail, delete the Communication and the return e-mail, and do not read, copy, retransmit or otherwise deal with it. 
Any views expressed in the Communication are those of the individual sender only, unless expressly stated to be those of Murdoch Children’s Research Institute (MCRI) ABN 21 006 566 972 or any of its related entities. MCRI does not accept liability in connection with the integrity of or errors in the Communication, computer virus, data corruption, interference or delay arising from or in respect of the Communication. ___________________________________________________________ Please keep all replies on the list by using "reply all" in your mail client. To manage your subscriptions to this and other Galaxy lists, please use the interface at: %(web_page_url)s To search Galaxy mailing lists use the unified search at: http://galaxyproject.org/search/<http://galaxyproject.org/search/><http://galaxyproject.org/search/<http://galaxyproject.org/search/>><http://galaxyproject.org/search/<http://galaxyproject.org/search/><http://galaxyproject.org/search/<http://galaxyproject.org/search/>>> ___________________________________________________________ Please keep all replies on the list by using "reply all" in your mail client. To manage your subscriptions to this and other Galaxy lists, please use the interface at: %(web_page_url)s To search Galaxy mailing lists use the unified search at: http://galaxyproject.org/search/<http://galaxyproject.org/search/><http://galaxyproject.org/search/<http://galaxyproject.org/search/>><http://galaxyproject.org/search/<http://galaxyproject.org/search/><http://galaxyproject.org/search/<http://galaxyproject.org/search/>>> ___________________________________________________________ Please keep all replies on the list by using "reply all" in your mail client. 
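On Sandra's last question, checking that jobs actually reach the queue, one simple approach is to run a small job (e.g. the upload tool) and watch `qstat` for jobs owned by the galaxy user. A minimal illustrative sketch follows; the column layout assumed here is the common default `qstat` output, so check it against your site's actual format:

```python
# Illustrative sketch: confirm Galaxy jobs reach the PBS queue by polling
# qstat and looking for jobs owned by the "galaxy" user.
import subprocess


def jobs_for_user(qstat_output, user):
    """Return job IDs owned by `user` from plain `qstat` output."""
    jobs = []
    for line in qstat_output.splitlines():
        fields = line.split()
        # Typical columns: Job id, Name, User, Time Use, S, Queue
        if len(fields) >= 6 and fields[2] == user:
            jobs.append(fields[0])
    return jobs


def poll_queue(user="galaxy"):
    # Requires a working qstat on PATH; run this on the Galaxy host.
    out = subprocess.run(["qstat"], capture_output=True, text=True).stdout
    return jobs_for_user(out, user)
```

Running an upload in the Galaxy UI and seeing a matching entry appear (and later complete) is a reasonable end-to-end smoke test.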
Hi Sandra,

We're using pbs-python with a dynamic job destination:

<destinations default="dynamic">
    <destination id="local" runner="local">
        <param id="embed_metadata_in_job">True</param>
    </destination>
    <destination id="dynamic" runner="dynamic">
        <param id="type">python</param>
        <param id="function">default_runner</param>
    </destination>
</destinations>

The default_runner function is defined in lib/galaxy/jobs/rules/200_runners.py. There you can do more or less whatever you want: select queue, walltime and resources based on job_id, input file size, etc. We provide queue and account using this syntax when returning the JobDestination:

return JobDestination(runner=destination, params={"Resource_List": resources, "-A": account, "-q": queue})

Best,
Geert

On 03.09.19 11:20, SAPET, Frederic via galaxy-dev wrote:
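Geert's dynamic rule might look roughly like the sketch below. It is self-contained for illustration only: the queue names, resource strings, size threshold and account are made up, and the `JobDestination` class here merely stands in for `galaxy.jobs.JobDestination`, which a real rules file would import.

```python
class JobDestination(dict):
    """Stand-in for galaxy.jobs.JobDestination, just for this sketch."""
    def __init__(self, runner=None, params=None):
        super().__init__(runner=runner, params=params or {})


def choose_destination(input_size_bytes, account="galaxy"):
    """Pick queue/resources from total input size (thresholds illustrative).

    A real default_runner(job) rule would compute input_size_bytes from
    the job object's input datasets before calling logic like this.
    """
    if input_size_bytes > 10 * 1024 ** 3:  # inputs over 10 GB
        queue, resources = "bigmem", "select=1:ncpus=8:mem=128gb"
    else:
        queue, resources = "workq", "select=1:ncpus=2:mem=16gb"
    return JobDestination(
        runner="drmaa",
        params={"Resource_List": resources, "-q": queue, "-A": account},
    )
```

The appeal of the dynamic approach is exactly what Geert notes: routing decisions live in ordinary Python, so they can consider anything visible from the job.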
Hi,
Yes, each tool needs to be individually configured.
But you can set a default destination:

<destinations default="Universe">

So all tools will use that one. That's why we set another destination (local) for the upload tool, and yet another (UniverseBlast) for BlastN. All tools use the default "Universe" except upload and blastn:

<tools>
    <tool id="upload1" destination="local"/>
    <tool id="ncbi_blastn_wrapper" destination="UniverseBlast"/>
</tools>
We work here with PBS-DRMAA. PBS-python is not installed in the galaxy virtualenv.
We have DRMAA_PATH and DRMAA_LIBRARY_PATH set as environment variables.
Fred
-----Original message----- From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au> Sent: Tuesday, 3 September 2019 07:36 To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: [galaxy-dev] Re: pbs-python issues
Hi Fred,
Thanks for your job_conf example. Could I confirm whether each tool needs to be individually configured to suit a specific instance's requirements? I'm having a problem with certain tools not being found, or with tools running successfully but their output not being displayed correctly. I'm mainly interested in whether this might indicate an installation issue or whether more per-tool configuration needs to be done.
Ps. I ended up getting pbs_python from source working, so I'm trying out the PBSJobRunner at the moment. However, I have noticed that specifying a qsub flag as a param option (i.e. pretty much anything other than "Resource_List") doesn't actually work: I can't get the job to submit to a particular queue. And it doesn't seem to have an equivalent of the "nativeSpecification" param that pbs-drmaa does. Have you had any experience with pbs-python vs. pbs-drmaa? I'd be open to switching if the latter turns out to be more reliable/flexible.
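Whether the pbs_python runner accepts arbitrary qsub flags as params can't be confirmed from this thread alone, but the contrast Sandra raises is real: pbs-drmaa collapses all submit options into a single nativeSpecification string, whereas pbs_python takes individual params. A tiny illustrative helper for flattening a dict of ordinary qsub-style options into such a string (option names and values here are examples only):

```python
def native_specification(options):
    """Flatten {flag: value} qsub-style options into one string.

    Sorted for deterministic output; a real runner does not require
    any particular flag order.
    """
    parts = []
    for flag, value in sorted(options.items()):
        parts.append(f"{flag} {value}")
    return " ".join(parts)
```

The resulting string is what you would place in a <param id="nativeSpecification"> element of a drmaa destination.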
Cheers, Sandra
From: SAPET, Frederic <Frederic.SAPET@biogemma.com> Sent: Friday, 30 August 2019 5:52 PM To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: RE: pbs-python issues
Sandra
Have you installed DRMAA? https://docs.galaxyproject.org/en/latest/admin/cluster.html#drmaa
Fred
-----Original message----- From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au> Sent: Thursday, 29 August 2019 03:49 To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: [galaxy-dev] Re: pbs-python issues
Looking more closely at the parse error, it appears that galaxy cannot find the PBS plugin (pbs_python installed from source):
for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'):
As expected, pbs_python can happily detect my PBS installation when I run its PBSQuery.py script. The source and python installers are noticeably different upon inspection; unfortunately, I still haven't managed to get the python installation working.
Here is my job_conf.xml:
<plugins>
    <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner"/>
</plugins>
<destinations default="pbs_default">
    <destination id="pbs_default" runner="pbs"/>
    <destination id="other_cluster" runner="pbs">
        <param id="destination">@other.cluster</param>
    </destination>
</destinations>
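Worth noting: as posted, this snippet has two top-level elements (<plugins> and <destinations>). If the real file likewise lacks a single enclosing root such as <job_conf>, ElementTree raises exactly the "junk after document element" error from the startup traceback. The failing parse can be reproduced standalone to debug the file:

```python
import xml.etree.ElementTree as ElementTree


def check_job_conf(path_or_text):
    """Parse a job_conf and report the first structural problem found."""
    try:
        if path_or_text.lstrip().startswith("<"):
            root = ElementTree.fromstring(path_or_text)
        else:
            root = ElementTree.parse(path_or_text).getroot()
    except ElementTree.ParseError as err:
        return f"parse error: {err}"
    if root.tag != "job_conf":
        return f"unexpected root element: {root.tag}"
    plugins = root.find("plugins")
    if plugins is None:
        return "no <plugins> section"
    return "ok: %d plugin(s)" % len(plugins.findall("plugin"))
```

This mirrors the `ElementTree.parse(job_conf_xml).find('plugins')` call in Galaxy's dependency scan, so a file that passes here should at least get past that stage of startup.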
Thanks, Sandra
From: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au> Sent: Thursday, 29 August 2019 10:08 AM To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: [galaxy-dev] Re: pbs-python issues
Hi Fred,
For us, PBS (free) is installed to a shared filesystem that is mounted via NFS to the galaxy server (and via GPFS to the rest of the HPC). The server home directory is also custom (/opt) but local to the disk on each HPC node (including galaxy).
I have tested running a simple PBS script from the galaxy node as the galaxy user, and it ran successfully. In your case, have you installed PBS Pro with all default installation paths? It seems to me that the pbs-python module is happy to be installed against PBS defaults but has trouble with a custom PBS install. I am having to dig into the pbs-python installer script to hunt down the relevant variables.
Re: parse error – I had configured a basic job_conf.xml to test with galaxy, but it seems it was too basic. Are you perchance able to provide an example of your job_conf.xml for reference?
Thanks, Sandra
From: SAPET, Frederic via galaxy-dev <galaxy-dev@lists.galaxyproject.org> Sent: Wednesday, 28 August 2019 5:09 PM To: Sandra Maksimovic <sandra.maksimovic@mcri.edu.au>; 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org> Subject: [galaxy-dev] Re: pbs-python issues
Hi Sandra
We're running a Galaxy instance with PBS Pro as the job scheduler. I never had any problem with pbs-python, however.
Here, for us, the PBS client is installed locally (on the VM that hosts Galaxy).
Are you sure that PBS is well installed? Are you able to launch a simple PBS script as the galaxy user? Maybe the error you see (xml.etree.ElementTree.ParseError) is because one of the XML config files has an issue.
Fred
--
Geert Vandeweyer, Ph.D.
Department of Medical Genetics, Cognitive Genetics
University of Antwerp
Prins Boudewijnlaan 43/6, 2650 Edegem, Belgium
Tel: +32 (0)3 275 97 56
E-mail: geert.vandeweyer@uantwerpen.be
https://www.uantwerpen.be/en/rg/cognet/
http://www.linkedin.com/pub/geert-vandeweyer/26/457/726

Email Efficiency Disclaimer: I read & answer emails between 8 AM and 9 AM. I'm not available on Thursdays. For urgent matters during the day, call me.
Hi Sandra,

It is the same way for us here: NFS, GPFS, and PBS is in /opt/pbs. Here is a sample of our job_conf:

<job_conf>
    <plugins>
        <plugin id="local" type="runner" load="galaxy.jobs.runners.local:LocalJobRunner" workers="4"/>
        <plugin id="drmaa" type="runner" load="galaxy.jobs.runners.drmaa:DRMAAJobRunner"/>
    </plugins>
    <destinations default="Universe">
        <destination id="local" runner="local"/>
        <destination id="Universe" runner="drmaa">
            <param id="nativeSpecification">-q soft -l select=1:ncpus=2:mem=64gb -l walltime=50:00:00</param>
        </destination>
        <destination id="UniverseBlast" runner="drmaa">
            <param id="nativeSpecification">-q soft -l select=1:ncpus=6:mem=80gb -l walltime=150:00:00</param>
        </destination>
    </destinations>
    <tools>
        <tool id="upload1" destination="local"/>
        <tool id="ncbi_blastn_wrapper" destination="UniverseBlast"/>
    </tools>
</job_conf>

I hope it will help you.

Fred
participants (3)
- Geert Vandeweyer
- Sandra Maksimovic
- SAPET, Frederic