Glad you found a workaround - sorry it is such a... sub-optimal one. The silent failure does indeed make these issues very difficult to track down. As was mentioned in the other thread, this is a fairly beta feature. There are any number of things that could be done to improve it - such as better feedback - and I doubt these are particularly hard challenges; they would just take developer time. If you feel this feature is useful and these issues important, I would encourage you to vote on this Trello card: https://trello.com/c/qCfAWeYU.

-John

On Thu, Feb 6, 2014 at 11:48 AM, graham etherington (TSL) <graham.etherington@sainsbury-laboratory.ac.uk> wrote:
Hi John,

As the Galaxy instance I'm importing into is quite a new build, I did indeed have 'cleanup_job=never' set for debugging purposes. I changed this (to 'onsuccess') and I was then able to import a history from my account on the public Galaxy server, but still not from my other local instance. I think there must be some sort of local networking issue going on here and that the history import is just failing silently. I have now managed, though, to download the Galaxy_history.tar.gz file, place it on a public server, and import it from there, so at least that's a little hack I can use for now. Anyway, other than the silent fail, I don't think this is a Galaxy problem.
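(For anyone finding this thread later: the workaround amounts to serving the archive over plain HTTP from any host the job nodes can reach. With Python 2, something like the following would do - the directory, hostname, and port below are placeholders:)

    cd /path/containing/the/archive
    python -m SimpleHTTPServer 8000
    # then import http://<reachable-hostname>:8000/Galaxy_history.tar.gz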
Thanks for all your help! Best wishes, Graham
Dr. Graham Etherington Bioinformatics Support Officer, The Sainsbury Laboratory, Norwich Research Park, Norwich NR4 7UH. UK Tel: +44 (0)1603 450601
From: John Chilton <chilton@msi.umn.edu> Date: Thursday, 6 February 2014 16:54
To: "graham etherington (TSL)" <graham.etherington@sainsbury-laboratory.ac.uk> Cc: Galaxy Dev <galaxy-dev@lists.bx.psu.edu> Subject: Re: [galaxy-dev] Running Import History from File as local job
Any chance you have cleanup_job=never in universe_wsgi.ini? I recently pushed a bugfix to galaxy-central - history import wouldn't work if that is set.
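For reference, the setting in question lives in universe_wsgi.ini; if memory serves, the accepted values are always, onsuccess, and never:

    # universe_wsgi.ini
    # cleanup_job = never     # keeps all job files; the value that hit the bug
    cleanup_job = onsuccess   # cleans up job files only for successful jobs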
Otherwise you may want to review this thread for clues:
http://lists.bx.psu.edu/pipermail/galaxy-dev/2013-December/017773.html
-John
On Thu, Feb 6, 2014 at 10:34 AM, graham etherington (TSL) <graham.etherington@sainsbury-laboratory.ac.uk> wrote:
Hi John, Thanks for your suggestion. I included the suggested line in my job_conf.xml file and sure enough the job was run locally. It exits without any errors, but unfortunately it still didn't work - the imported history doesn't appear in the history panel. Here's the output from the paster log...
galaxy.jobs DEBUG 2014-02-06 14:48:38,463 (714) Working directory for job is: /tsl/services/galaxy/dist/galaxy-dist/database/job_working_directory/000/714
galaxy.jobs.handler DEBUG 2014-02-06 14:48:38,471 (714) Dispatching to local runner
galaxy.jobs DEBUG 2014-02-06 14:48:38,544 (714) Persisting job destination (destination id: local)
galaxy.jobs.handler INFO 2014-02-06 14:48:38,595 (714) Job dispatched
galaxy.jobs.runners.local DEBUG 2014-02-06 14:48:38,854 (714) executing: export GALAXY_SLOTS="1"; python /tsl/services/galaxy/dist/galaxy-dist/lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py http://galaxy.tsl.ac.uk/u/ethering/h/galaxy-intro /tsl/services/galaxy/dist/galaxy-dist/database/tmp/tmp4yCMyk --url
galaxy.jobs DEBUG 2014-02-06 14:48:38,936 (714) Persisting job destination (destination id: local)
galaxy.jobs.runners.local DEBUG 2014-02-06 14:48:45,024 execution finished: export GALAXY_SLOTS="1"; python /tsl/services/galaxy/dist/galaxy-dist/lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py http://galaxy.tsl.ac.uk/u/ethering/h/galaxy-intro /tsl/services/galaxy/dist/galaxy-dist/database/tmp/tmp4yCMyk --url
galaxy.jobs DEBUG 2014-02-06 14:48:45,179 job 714 ended
...and here's a slimmed-down version of my job_conf.xml file (TSL-Test128 is the default LSF queue that Galaxy jobs are sent to):
<?xml version="1.0"?>
<job_conf>
    <plugins workers="8">
        <plugin id="local" type="runner" load="galaxy.jobs.runners.local:LocalJobRunner" workers="16"/>
        <plugin id="drmaa" type="runner" load="galaxy.jobs.runners.drmaa:DRMAAJobRunner" workers="8"/>
    </plugins>
    <handlers default="handlerLocal">
        <handler id="main" tags="handlerLocal"/>
    </handlers>
    <destinations default="TSL-Test128">
        <destination id="local" runner="local"/>
        <destination id="TSL-Test128" runner="drmaa" tags="LSF,Test128"/>
    </destinations>
    <tools default="local">
        <!-- make the import histories tool run locally -->
        <tool id="__IMPORT_HISTORY__" destination="local"/>
    </tools>
</job_conf>
I'm presuming that if there were something wrong with the XML, an error would be thrown, so I'm not sure if there's something else I'm missing. Many thanks, Graham
From: John Chilton <chilton@msi.umn.edu> Date: Thursday, 6 February 2014 14:32 To: "graham etherington (TSL)" <graham.etherington@sainsbury-laboratory.ac.uk> Cc: Galaxy Dev <galaxy-dev@lists.bx.psu.edu> Subject: Re: [galaxy-dev] Running Import History from File as local job
Hello Dr. Etherington,
The id of the special tool is __IMPORT_HISTORY__, so if you have a local destination called 'local', the following tool entry SHOULD force a local import - it seemed to function the way I expected in a quick test, anyway:
<tool id="__IMPORT_HISTORY__" destination="local" />
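For completeness, a minimal sketch of the surrounding job_conf.xml pieces that entry assumes (ids and worker counts are just illustrative; in practice your default destination would stay pointed at the cluster):

    <?xml version="1.0"?>
    <job_conf>
        <plugins>
            <plugin id="local" type="runner"
                    load="galaxy.jobs.runners.local:LocalJobRunner" workers="4"/>
        </plugins>
        <destinations default="local">
            <destination id="local" runner="local"/>
        </destinations>
        <tools>
            <!-- route only the history-import pseudo-tool to the local runner -->
            <tool id="__IMPORT_HISTORY__" destination="local"/>
        </tools>
    </job_conf>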
Hope this helps, -John
On Thu, Feb 6, 2014 at 6:12 AM, graham etherington (TSL) <graham.etherington@sainsbury-laboratory.ac.uk> wrote:
Hi, I have two local instances of Galaxy running (both entirely independent of each other and running on separate clusters). I'm trying to export a history from Instance A to Instance B, but so far have not been able to. I've attempted this in two ways.
1. Download the Galaxy_history.tar.gz file from Instance A (Options > Export to File) and place it in a path on the filesystem which Instance B can see. Then go Options > Import from File and in the 'Archived History URL:' field enter '/full/path/to/Galaxy_history.tar.gz' (I've also tried 'file:///full/path/to/Galaxy_history.tar.gz'). The job runs (submitted to the LSF cluster) with no errors in paster.log, but the stdout file in the job_working_directory gives the error:

Exception getting file from URL: unknown url type:/full/path/to/Galaxy_history.tar.gz <open file '<stderr>', mode 'w' at 0x2ba2d4e9b1e0>
Error unpacking tar/gz archive: nothing to open <open file '<stderr>', mode 'w' at 0x2ba2d4e9b1e0>
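That first message looks like what Python 2's urllib2 raises when handed a scheme-less path, which would fit if the import script simply calls urlopen on whatever is in the URL field (an assumption on my part - I haven't traced the code). A minimal reproduction, paths as above:

    # Python 2, as Galaxy uses
    import urllib2
    urllib2.urlopen('/full/path/to/Galaxy_history.tar.gz')
    # -> ValueError: unknown url type: /full/path/to/Galaxy_history.tar.gz
    # A file:// URL at least parses, but it only resolves on whichever
    # node actually runs the job:
    urllib2.urlopen('file:///full/path/to/Galaxy_history.tar.gz')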
2. Copy the URL link for the history in Instance A (Options > Share or Publish), then in Instance B go Options > Import from File and in the 'Archived History URL:' field enter the URL provided by Instance A. Again the job runs OK, but the stdout file has the error:

Exception getting file from URL: <urlopen error [Errno 111] Connection refused> <open file '<stderr>', mode 'w' at 0x2b359790a1e0>
Error unpacking tar/gz archive: nothing to open <open file '<stderr>', mode 'w' at 0x2b359790a1e0>
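For what it's worth, a quick way to check whether a compute node can reach Instance A at all (Python 2; the URL below is a placeholder for the actual history link):

    import urllib2
    # "Connection refused" or a timeout here would reproduce the
    # failure above, run from a node on Instance B's cluster
    urllib2.urlopen('http://instance-a.example.org/u/user/h/my-history')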
I'm pretty sure method number 2 is failing because the Import History job is being run on the cluster, which cannot see the outside world (and hence cannot see Instance A). I think I should be able to overcome this by specifying in job_conf.xml that the import tool be run locally. The problem is that I've not been able to identify the actual tool or process that runs the Import Histories method. I know that it produces a bash script which runs lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py, but as it is not a standard tool (i.e. one found in the ./tools/ directory with an XML wrapper), I can't find an ID for it. Naively, I tried the following in job_conf.xml (I have 'local' defined in 'destinations' and 'plugins'):
<tools default="local">
    <!-- make the import histories tool run locally -->
    <tool id="lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py" destination="local"/>
</tools>
But this made no difference.
Does anyone know how I can either get the Import History tool to run locally or suggest another way to import my history?
Many thanks, Graham
Dr. Graham Etherington Bioinformatics Support Officer, The Sainsbury Laboratory, Norwich Research Park, Norwich NR4 7UH. UK Tel: +44 (0)1603 450601