It sounds like a deferred job plugin will work as the mechanism to manage the jobs, once there is a tool that communicates with two different Galaxy instances via the API and tells them to start transferring/loading files. I think the tools '__EXPORT_HISTORY__' and '__IMPORT_HISTORY__' (under lib/galaxy/tools/imp_exp/), extended with options to select subsets of elements from a history, would be useful in this case. Then I just need to drive them via the API.
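
Something like the sketch below is what I have in mind for the driver script. It assumes a client library along the lines of BioBlend and its export_history / download_history / import_history helpers; the instance URLs, API keys, history ID and file names are just placeholders.

    from bioblend.galaxy import GalaxyInstance

    # Source and destination Galaxy instances; URLs and API keys are placeholders.
    src = GalaxyInstance(url="https://galaxy-src.example.org", key="SRC_API_KEY")
    dst = GalaxyInstance(url="https://galaxy-dst.example.org", key="DST_API_KEY")

    history_id = "f2db41e1fa331b3e"  # encoded ID of the history to synchronize

    # 1. Build an export archive on the source instance (server side this is
    #    the __EXPORT_HISTORY__ tool) and block until it is ready.
    jeha_id = src.histories.export_history(history_id, wait=True)

    # 2. Pull the archive down locally...
    with open("history_archive.tar.gz", "wb") as fh:
        src.histories.download_history(history_id, jeha_id, fh)

    # 3. ...and hand it to the destination instance, which unpacks it with
    #    the __IMPORT_HISTORY__ tool on its side.
    dst.histories.import_history(file_path="history_archive.tar.gz")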

But there still may be a problem with mapping IDs between the systems...

Kyle
 


On Wed, Jan 9, 2013 at 9:48 AM, Nate Coraor <nate@bx.psu.edu> wrote:
On Jan 7, 2013, at 1:52 PM, Kyle Ellrott wrote:

> I'm trying to figure out if I can do this all through the API (so I can skip setting up FTP servers and sharing database servers).
> I can scan one system, and initiate downloads on the destination system (using upload1). So as far as moving files from one machine to another, it should be fine. I could push all the data, with correct names, annotations and tags.
> But then it becomes a matter of pushing the metadata onto the objects on the destination system. There are two problems:
> 1) There is no way to push tool info and input parameter data; everything would be a product of 'upload1'.
> 2) There are no global IDs. If I try to sync a second time, it may be difficult to figure out which elements have been previously pushed, and comparing names could lead to problems...

Hi Kyle,

The transfer manager was designed as a very general method of downloading data to a temporary location in a Galaxy-independent-but-manageable fashion.  To make it more useful, it's designed to be used with deferred jobs.  The idea is that you write a deferred job plugin that can be told when to initiate a task (like adding things to the transfer manager), create deferred jobs that depend on those transfers, and then take certain actions that you specify once those transfers have completed.  There's no documentation, but if you have a look at the existing plugins in lib/galaxy/jobs/deferred, that should give you an idea of how it works.
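
Very roughly, a plugin is a class with hooks that the deferred job queue calls to create the job, poll whether the things it depends on (e.g. transfers) have finished, and then run the real work. The skeleton below is only an approximation to give you the shape of it; the method names, states, and the _transfer_is_done() helper are illustrative, so check the real plugins for the actual interface:

    # Illustrative skeleton only; the real interface is whatever the plugins
    # in lib/galaxy/jobs/deferred/ implement, and the names here are approximations.
    class RemoteSyncPlugin(object):
        """Deferred job plugin that waits on transfer manager downloads."""

        def __init__(self, app):
            self.app = app

        def create_job(self, trans, url, **kwd):
            # Queue the download with the transfer manager, then record a
            # deferred job whose params remember which transfer it waits on.
            transfer = self.app.transfer_manager.new(protocol='http', url=url)
            return self.app.model.DeferredJob(
                state=self.app.model.DeferredJob.states.NEW,
                plugin=self.__class__.__name__,
                params=dict(transfer_job_id=transfer.id, url=url))

        def check_job(self, job):
            # Polled by the deferred job queue: report READY once the transfer
            # recorded in job.params has finished, WAIT otherwise.
            done = self._transfer_is_done(job.params['transfer_job_id'])  # stand-in helper
            return self.job_states.READY if done else self.job_states.WAIT

        def run_job(self, job):
            # Runs once check_job() reports READY: take the downloaded file and
            # do whatever the plugin exists for, e.g. create a library dataset.
            pass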

--nate

>
> Kyle
>
>
>
> On Fri, Jan 4, 2013 at 4:54 AM, Rémy Dernat <remy.d1@gmail.com> wrote:
> Hi,
>
> A good practice is to create an FTP server (http://wiki.galaxyproject.org/Admin/Config/Upload%20via%20FTP) and use a tool to send / retrieve information to and from this FTP server:
> http://toolshed.g2.bx.psu.edu/
> -> Data Source / data_nfs
>
> Then export your FTP directory via NFS to your Galaxy installations.
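>
> For example (paths and host names below are only placeholders), the FTP upload settings in universe_wsgi.ini look roughly like this:
>
>     ftp_upload_dir = /galaxy/ftp
>     ftp_upload_site = ftp.example.org
>
> and the NFS export on the file server would be something along these lines in /etc/exports:
>
>     /galaxy/ftp  galaxy-node1(rw,sync)  galaxy-node2(rw,sync)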
>
> For databases, it is a little more complex. If you have a database server, you could share access to a single database, but the installation would have to be the same on each of your servers, and all working directories would have to be shared via NFS across all Galaxy servers...
>
> Regards
>
>
> 2012/12/6 Kyle Ellrott <kellrott@soe.ucsc.edu>
> Is there any documentation on the transfer manager?
> Is this a mechanism that I could use to synchronize data libraries between two different Galaxy installations?
>
> Kyle
>
> ___________________________________________________________
> Please keep all replies on the list by using "reply all"
> in your mail client.  To manage your subscriptions to this
> and other Galaxy lists, please use the interface at:
>
>   http://lists.bx.psu.edu/
>