Hi Tiziano,

I think your assumption is correct.
If you do not need uploads to run on Pulsar, you should be able to route the upload tool (its tool id is upload1) to a local destination in your job_conf.xml.
There are some examples described here:
https://github.com/galaxyproject/galaxy/blob/dev/config/job_conf.xml.sample_advanced#L496
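A minimal sketch of that approach might look like the following. The destination ids and the Pulsar URL here are placeholders (assumptions, not taken from your setup), so adapt them to your existing job_conf.xml:

```xml
<job_conf>
    <plugins>
        <!-- a local runner alongside the Pulsar runner -->
        <plugin id="local" type="runner" load="galaxy.jobs.runners.local:LocalJobRunner"/>
        <plugin id="pulsar" type="runner" load="galaxy.jobs.runners.pulsar:PulsarRESTJobRunner"/>
    </plugins>
    <destinations default="win_pulsar">
        <destination id="local" runner="local"/>
        <destination id="win_pulsar" runner="pulsar">
            <!-- placeholder: your existing Pulsar server URL -->
            <param id="url">https://your-pulsar-host:8913/</param>
        </destination>
    </destinations>
    <tools>
        <!-- route only the upload tool to the local destination;
             everything else still defaults to win_pulsar -->
        <tool id="upload1" destination="local"/>
    </tools>
</job_conf>
```

With a mapping like this, uploads run locally on the Galaxy server (where galaxy.model is importable) while all other jobs keep going to Pulsar.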
If you can place a copy of galaxy and a virtualenv on your pulsar server, you could also set this up as in
https://github.com/galaxyproject/galaxy/blob/dev/config/job_conf.xml.sample_advanced#L379
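In that case the Pulsar destination would additionally point at the Galaxy copy on the Pulsar server, roughly like this. This is only a sketch: the paths are placeholders and the parameter names should be checked against the sample file linked above:

```xml
<destination id="win_pulsar" runner="pulsar">
    <!-- placeholder: your Pulsar server URL -->
    <param id="url">https://your-pulsar-host:8913/</param>
    <!-- collect metadata on the remote side using the remote Galaxy copy -->
    <param id="remote_metadata">true</param>
    <!-- placeholder path: the Galaxy clone you placed on the Pulsar server -->
    <param id="remote_property_galaxy_home">/path/to/remote/galaxy</param>
</destination>
```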

Note that I haven't tried this yet, but I think this is a good start. Let us know if that works.

Cheers,
Marius

On 9 May 2016 at 10:42, Tiziano Flati <tiziano.flati@gmail.com> wrote:
Hi all,

I have successfully set up a Galaxy-Pulsar architecture and I am able to run jobs on datasets already uploaded to a history.

When I try to upload a new file to the history, though, the upload job fails with the following error:

Traceback (most recent call last):
  File "/home/flati/pulsar/files/staging/80/tool_files/upload.py", line 18, in <module>
    import galaxy.model # noqa
ImportError: No module named model

Note: in job_conf.xml, Pulsar is the default destination:

<destinations default="win_pulsar">

Does anyone know what the problem is?
I suspect that setting Pulsar as the default destination causes the upload tool to run on the Pulsar side, which does not have access to Galaxy's lib directory (where the galaxy.model module lives).

Any help is much appreciated,
Tiziano

___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
  https://lists.galaxyproject.org/

To search Galaxy mailing lists use the unified search at:
  http://galaxyproject.org/search/mailinglists/