I'm trying to figure out if I can do this all through the API (so I can skip setting up FTP servers and sharing database servers). 
I can scan one system and initiate downloads on the destination system (using upload1). So as far as moving files from one machine to another goes, it should be fine. I could push all the data with correct names, annotations, and tags.
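Roughly what I have in mind, sketched with BioBlend (the URLs and keys are placeholders, and public_url_for() is a hypothetical helper that exposes a source dataset at a URL the destination can fetch):

    # Minimal sketch: copy library datasets from one Galaxy to another via the API.
    from bioblend.galaxy import GalaxyInstance

    src = GalaxyInstance(url='http://src.example.org', key='SRC_API_KEY')
    dst = GalaxyInstance(url='http://dst.example.org', key='DST_API_KEY')

    dst_lib = dst.libraries.create_library('Synced library')

    for lib in src.libraries.get_libraries():
        # contents=True lists every folder and dataset in the library
        for item in src.libraries.show_library(lib['id'], contents=True):
            if item['type'] != 'file':
                continue
            # public_url_for() is hypothetical; everything lands on the
            # destination as a product of 'upload1'
            dst.libraries.upload_file_from_url(dst_lib['id'],
                                               public_url_for(src, item))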
But then it becomes a matter of pushing the metadata onto the objects on the destination system, and there are two problems:
1) There's no way to push tool info and input parameter data; everything would appear as a product of 'upload1'.
2) There are no global IDs. If I try to sync a second time, it may be difficult to figure out which elements have already been pushed. Comparing names could lead to problems (see the sketch below)...
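The best I can see for a second sync is a name-based diff like the one below (same BioBlend setup as above; src_lib_id and dst_lib_id are placeholders). It relies on the full '/folder/name' paths the library contents listing returns, so a rename on either side, or two items sharing a name, silently breaks the matching:

    # Fragile name-based diff: push only items whose path isn't on the destination.
    dst_names = set(item['name']
                    for item in dst.libraries.show_library(dst_lib_id, contents=True)
                    if item['type'] == 'file')

    to_push = [item
               for item in src.libraries.show_library(src_lib_id, contents=True)
               if item['type'] == 'file' and item['name'] not in dst_names]
    # Renamed or duplicate names on either side corrupt the sync.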

Kyle



On Fri, Jan 4, 2013 at 4:54 AM, Rémy Dernat <remy.d1@gmail.com> wrote:
Hi,

A good practice is to set up an FTP server (http://wiki.galaxyproject.org/Admin/Config/Upload%20via%20FTP) and use a tool to send/retrieve files to/from this FTP server:
http://toolshed.g2.bx.psu.edu/
-> Data Source / data_nfs

Then export your FTP directory over NFS to your Galaxy installations.
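For example, in universe_wsgi.ini on each instance (the option names are from the wiki page above; the path and hostname are only examples):

    # universe_wsgi.ini -- FTP upload options
    [app:main]
    ftp_upload_dir = /nfs/galaxy_ftp      # the NFS-exported FTP directory
    ftp_upload_site = ftp.example.org     # hostname shown to users for FTP uploads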

For databases, it is a bit more complex. If you have a database server, you could share access to a single database, but the Galaxy installations on your servers should be identical, and all working directories must be shared over NFS across all Galaxy servers...
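A sketch of the relevant universe_wsgi.ini options, which must be identical on every server (the connection string and paths are only examples):

    # universe_wsgi.ini -- shared database and shared file store
    [app:main]
    database_connection = postgres://galaxy:secret@dbhost/galaxy
    file_path = /nfs/galaxy/files   # dataset files; same NFS mount on all servers
    new_file_path = /nfs/galaxy/tmp # temp/working directory; also shared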

Regards


2012/12/6 Kyle Ellrott <kellrott@soe.ucsc.edu>
Is there any documentation on the transfer manager?
Is this a mechanism that I could use to synchronize data libraries between two different Galaxy installations?

Kyle
