1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f5f6cd1e9383/
Changeset: f5f6cd1e9383
User: jmchilton
Date: 2014-05-13 07:46:41
Summary: Improvements to LWR documentation.
Add separate examples for the MQ-driven setup - this should clarify which new parameters are required when using a message queue and which are no longer required. Add more documentation for remote dependency resolution. Document rewrite_parameters.
Affected #: 1 file
diff -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 -r f5f6cd1e938304094c9491da10824f93cbfb2d27 job_conf.xml.sample_advanced
--- a/job_conf.xml.sample_advanced
+++ b/job_conf.xml.sample_advanced
@@ -23,11 +23,6 @@
<!-- More information on LWR can be found at https://lwr.readthedocs.org --><!-- Uncomment following line to use libcurl to perform HTTP calls (defaults to urllib) --><!-- <param id="transport">curl</param> -->
- <!-- Uncomment following parameters (second optional) to target a message
- queue, ensure jobs_directory is specified on destinations and all
- file actions are remote executable. -->
- <!-- <param id="url">amqp://guest:guest@localhost:5672//</param> -->
- <!-- <param id="manager">_default_</param> --><!-- *Experimental Caching*: Uncomment next parameters to enable
caching and specify the number of caching threads to enable on Galaxy
side. Likely will not work with newer features such as MQ support.
@@ -37,6 +32,16 @@
<!-- <param id="cache">True</param> --><!-- <param id="transfer_threads">2</param> --></plugin>
+ <plugin id="amqp_lwr" type="runner" load="galaxy.jobs.runners.lwr:LwrJobRunner">
+ <param id="url">amqp://guest:guest@localhost:5672//</param>
+ <!-- If using message queue driven LWR - the LWR will generally
+ initiate file transfers, so the URL of this Galaxy instance
+ must be configured. -->
+ <param id="galaxy_url">http://localhost:8080/</param>
+ <!-- If multiple managers configured on the LWR, specify which one
+ this plugin targets. -->
+ <!-- <param id="manager">_default_</param> -->
+ </plugin><plugin id="cli" type="runner" load="galaxy.jobs.runners.cli:ShellJobRunner" /><plugin id="condor" type="runner" load="galaxy.jobs.runners.condor:CondorJobRunner" /><plugin id="slurm" type="runner" load="galaxy.jobs.runners.slurm:SlurmJobRunner" />
@@ -116,7 +121,11 @@
<!-- Uncomment the following statement to disable file staging (e.g.
if there is a shared file system between Galaxy and the LWR
server). Alternatively action can be set to 'copy' - to replace
- http transfers with file system copies. -->
+ http transfers with file system copies, 'remote_transfer' to cause
+ the LWR to initiate HTTP transfers instead of Galaxy, or
+ 'remote_copy' to cause the LWR to initiate file system copies.
+ If setting this to 'remote_transfer' be sure to specify a
+ 'galaxy_url' attribute on the runner plugin above. --><!-- <param id="default_file_action">none</param> --><!-- The above option is just the default, the transfer behavior
none|copy|http can be configured on a per path basis via the
@@ -128,24 +137,29 @@
<!-- Uncomment following option to disable Galaxy tool dependency
resolution and utilize remote LWR's configuration of tool
dependency resolution instead (same options as Galaxy for
- dependency resolution are available in LWR).
+ dependency resolution are available in LWR). At a minimum
+ the remote LWR server should define a tool_dependencies_dir in
+ its `server.ini` configuration. The LWR will not attempt to
+ stage dependencies - so ensure the required Galaxy or tool
+ shed packages are available remotely (exact same tool shed
+ installed changesets are required).
--><!-- <param id="dependency_resolution">remote</param> -->
+ <!-- Traditionally, the LWR allows Galaxy to generate a command line
+ as if it were going to run the command locally and then the
+ LWR client rewrites it after the fact using regular
+ expressions. Setting the following value to true causes the
+ LWR runner to insert itself into the command line generation
+ process and generate the correct command line from the get go.
+ This will likely be the default someday - but requires a newer
+ LWR version and is less well tested. -->
+ <!-- <param id="rewrite_parameters">true</param> --><!-- Uncomment following option to enable setting metadata on remote
LWR server. The 'use_remote_datatypes' option is available for
determining whether to use remotely configured datatypes or local
ones (both alternatives are a little brittle). --><!-- <param id="remote_metadata">true</param> --><!-- <param id="use_remote_datatypes">false</param> -->
- <!-- Traditionally, the LWR client sends request to LWR
- server to populate various system properties. This
- extra step can be disabled and these calculated here
- on client by uncommenting jobs_directory and
- specifying any additional remote_property_ of
- interest. When using message queues this is nessecary
- not optional.
- -->
- <!-- <param id="jobs_directory">/path/to/remote/lwr/lwr_staging/</param> --><!-- <param id="remote_property_galaxy_home">/path/to/remote/galaxy-central</param> --><!-- If remote LWR server is configured to run jobs as the real user,
uncomment the following line to pass the current Galaxy user
@@ -157,6 +171,27 @@
--><!-- <param id="submit_native_specification">-P bignodes -R y -pe threads 8</param> --></destination>
+ <destination id="amqp_lwr_dest" runner="amqp_lwr" >
+ <!-- url and private_token are not valid when using MQ driven LWR. The plugin above
+ determines which queue/manager to target and the underlying MQ server should be
+ used to configure security.
+ -->
+ <!-- Traditionally, the LWR client sends a request to the LWR
+ server to populate various system properties. This
+ extra step can be disabled and the properties calculated here
+ on the client by uncommenting jobs_directory and
+ specifying any additional remote_property_ of
+ interest; this is required, not optional, when using
+ message queues.
+ -->
+ <param id="jobs_directory">/path/to/remote/lwr/lwr_staging/</param>
+ <!-- By default the LWR sends files to and pulls files from Galaxy
+ when using message queues (in the more traditional mode Galaxy
+ sends files to and pulls files from the LWR - this is obviously
+ less appropriate when using a message queue).
+ -->
+ <param id="default_file_action">remote_transfer</param>
+ </destination><destination id="ssh_torque" runner="cli"><param id="shell_plugin">SecureShell</param><param id="job_plugin">Torque</param>
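Taken together, the MQ-driven plugin and destination excerpts above amount to a configuration along these lines - a minimal sketch using the sample values from the diff (URLs, paths, and ids must be adjusted for a real deployment):

```xml
<job_conf>
    <plugins>
        <plugin id="amqp_lwr" type="runner" load="galaxy.jobs.runners.lwr:LwrJobRunner">
            <param id="url">amqp://guest:guest@localhost:5672//</param>
            <!-- The LWR initiates transfers, so it needs Galaxy's URL. -->
            <param id="galaxy_url">http://localhost:8080/</param>
        </plugin>
    </plugins>
    <destinations>
        <destination id="amqp_lwr_dest" runner="amqp_lwr">
            <!-- Required (not optional) when using message queues. -->
            <param id="jobs_directory">/path/to/remote/lwr/lwr_staging/</param>
            <param id="default_file_action">remote_transfer</param>
        </destination>
    </destinations>
</job_conf>
```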
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/48ee04030f89/
Changeset: 48ee04030f89
User: jmchilton
Date: 2014-05-13 07:13:02
Summary: Implement serializable class that can describe full dependency context for remote dependency resolution.
Previously the LWR just serialized tool requirements; by transitioning to this class the LWR should be able to resolve tool shed installed packages as well, since those require the additional repository and tool dependency context.
Affected #: 2 files
diff -r 3b29b6f83d17ed07d2c3ea49d839b29d66cad580 -r 48ee04030f893a9a9657b639bb38819fb4404388 lib/galaxy/tools/deps/dependencies.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/dependencies.py
@@ -0,0 +1,68 @@
+from galaxy.tools.deps.requirements import ToolRequirement
+from galaxy.util import bunch
+
+
+class DependenciesDescription(object):
+ """ Capture (in a readily serializable way) context related to a tool's
+ dependencies - both the tool's listed requirements and the tool shed
+ related context required to resolve dependencies via the
+ ToolShedPackageDependencyResolver.
+
+ This is meant to enable remote resolution of dependencies, by the LWR or
+ other potential remote execution mechanisms.
+ """
+
+ def __init__(self, requirements=[], installed_tool_dependencies=[]):
+ self.requirements = requirements
+ # tool shed installed tool dependencies...
+ self.installed_tool_dependencies = installed_tool_dependencies
+
+ def to_dict(self):
+ return dict(
+ requirements=[r.to_dict() for r in self.requirements],
+ installed_tool_dependencies=[DependenciesDescription._toolshed_install_dependency_to_dict(d) for d in self.installed_tool_dependencies]
+ )
+
+ @staticmethod
+ def from_dict(as_dict):
+ if as_dict is None:
+ return None
+
+ requirements_dicts = as_dict.get('requirements', [])
+ requirements = [ToolRequirement.from_dict(r) for r in requirements_dicts]
+ installed_tool_dependencies_dicts = as_dict.get('installed_tool_dependencies', [])
+ installed_tool_dependencies = map(DependenciesDescription._toolshed_install_dependency_from_dict, installed_tool_dependencies_dicts)
+ return DependenciesDescription(
+ requirements=requirements,
+ installed_tool_dependencies=installed_tool_dependencies
+ )
+
+ @staticmethod
+ def _toolshed_install_dependency_from_dict(as_dict):
+ # Rather than requiring full models in LWR, just use simple objects
+ # containing only properties and associations used to resolve
+ # dependencies for tool execution.
+ repository_object = bunch.Bunch(
+ name=as_dict['repository_name'],
+ owner=as_dict['repository_owner'],
+ installed_changeset_revision=as_dict['repository_installed_changeset'],
+ )
+ dependency_object = bunch.Bunch(
+ name=as_dict['dependency_name'],
+ version=as_dict['dependency_version'],
+ type=as_dict['dependency_type'],
+ tool_shed_repository=repository_object,
+ )
+ return dependency_object
+
+ @staticmethod
+ def _toolshed_install_dependency_to_dict(tool_dependency):
+ tool_shed_repository = tool_dependency.tool_shed_repository
+ return dict(
+ dependency_name=tool_dependency.name,
+ dependency_version=tool_dependency.version,
+ dependency_type=tool_dependency.type,
+ repository_name=tool_shed_repository.name,
+ repository_owner=tool_shed_repository.owner,
+ repository_installed_changeset=tool_shed_repository.installed_changeset_revision,
+ )
diff -r 3b29b6f83d17ed07d2c3ea49d839b29d66cad580 -r 48ee04030f893a9a9657b639bb38819fb4404388 test/unit/tools/test_tool_dependency_description.py
--- /dev/null
+++ b/test/unit/tools/test_tool_dependency_description.py
@@ -0,0 +1,44 @@
+from galaxy.model import tool_shed_install
+from galaxy.tools.deps import requirements
+from galaxy.tools.deps import dependencies
+
+
+def test_serialization():
+ repository = tool_shed_install.ToolShedRepository(
+ owner="devteam",
+ name="tophat",
+ installed_changeset_revision="abcdefghijk",
+ )
+ dependency = tool_shed_install.ToolDependency(
+ name="tophat",
+ version="2.0",
+ type="package",
+ status=tool_shed_install.ToolDependency.installation_status.INSTALLED,
+ )
+ dependency.tool_shed_repository = repository
+ tool_requirement = requirements.ToolRequirement(
+ name="tophat",
+ version="2.0",
+ type="package",
+ )
+ descript = dependencies.DependenciesDescription(
+ requirements=[tool_requirement],
+ installed_tool_dependencies=[dependency],
+ )
+ result_descript = dependencies.DependenciesDescription.from_dict(
+ descript.to_dict()
+ )
+ result_requirement = result_descript.requirements[0]
+ assert result_requirement.name == "tophat"
+ assert result_requirement.version == "2.0"
+ assert result_requirement.type == "package"
+
+ result_tool_shed_dependency = result_descript.installed_tool_dependencies[0]
+    assert result_tool_shed_dependency.name == "tophat"
+    assert result_tool_shed_dependency.version == "2.0"
+    assert result_tool_shed_dependency.type == "package"
+ result_tool_shed_repository = result_tool_shed_dependency.tool_shed_repository
+    assert result_tool_shed_repository.name == "tophat"
+    assert result_tool_shed_repository.owner == "devteam"
+    assert result_tool_shed_repository.installed_changeset_revision == "abcdefghijk"
+
https://bitbucket.org/galaxy/galaxy-central/commits/24b5759f33f8/
Changeset: 24b5759f33f8
User: jmchilton
Date: 2014-05-13 07:13:02
Summary: Update LWR client through LWR changeset 9fb7fcb.
Among other smaller fixes, this update causes the LWR client/runner to pass more tool dependency context through to the remote LWR server when `dependency_resolution` is set to `remote`. This additional context allows tool shed packages to be resolved remotely (i.e. allows proper use of ToolShedPackageDependencyResolver on the remote LWR server).
It remains the responsibility of the deployer to ensure identical versions of tool shed packages are installed on the remote LWR server and the local Galaxy instance.
Affected #: 11 files
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr.py
--- a/lib/galaxy/jobs/runners/lwr.py
+++ b/lib/galaxy/jobs/runners/lwr.py
@@ -5,6 +5,7 @@
from galaxy.jobs import ComputeEnvironment
from galaxy.jobs import JobDestination
from galaxy.jobs.command_factory import build_command
+from galaxy.tools.deps import dependencies
from galaxy.util import string_as_bool_or_none
from galaxy.util.bunch import Bunch
@@ -118,9 +119,7 @@
return
try:
- dependency_resolution = LwrJobRunner.__dependency_resolution( client )
- remote_dependency_resolution = dependency_resolution == "remote"
- requirements = job_wrapper.tool.requirements if remote_dependency_resolution else []
+ dependencies_description = LwrJobRunner.__dependencies_description( client, job_wrapper )
rewrite_paths = not LwrJobRunner.__rewrite_parameters( client )
unstructured_path_rewrites = {}
if compute_environment:
@@ -133,7 +132,7 @@
working_directory=job_wrapper.working_directory,
tool=job_wrapper.tool,
config_files=job_wrapper.extra_filenames,
- requirements=requirements,
+ dependencies_description=dependencies_description,
env=client.env,
rewrite_paths=rewrite_paths,
arbitrary_files=unstructured_path_rewrites,
@@ -376,6 +375,19 @@
return client_outputs
@staticmethod
+ def __dependencies_description( lwr_client, job_wrapper ):
+ dependency_resolution = LwrJobRunner.__dependency_resolution( lwr_client )
+ remote_dependency_resolution = dependency_resolution == "remote"
+ if not remote_dependency_resolution:
+ return None
+ requirements = job_wrapper.tool.requirements or []
+ installed_tool_dependencies = job_wrapper.tool.installed_tool_dependencies or []
+ return dependencies.DependenciesDescription(
+ requirements=requirements,
+ installed_tool_dependencies=installed_tool_dependencies,
+ )
+
+ @staticmethod
def __dependency_resolution( lwr_client ):
dependency_resolution = lwr_client.destination_params.get( "dependency_resolution", "local" )
if dependency_resolution not in ["none", "local", "remote"]:
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/__init__.py
--- a/lib/galaxy/jobs/runners/lwr_client/__init__.py
+++ b/lib/galaxy/jobs/runners/lwr_client/__init__.py
@@ -4,6 +4,39 @@
This module contains logic for interfacing with an external LWR server.
+------------------
+Configuring Galaxy
+------------------
+
+Galaxy job runners are configured in Galaxy's ``job_conf.xml`` file. See ``job_conf.xml.sample_advanced``
+in your Galaxy code base or on
+`Bitbucket <https://bitbucket.org/galaxy/galaxy-dist/src/tip/job_conf.xml.sample_advanc…>`_
+for information on how to configure Galaxy to interact with the LWR.
+
+Galaxy also supports an older, less rich configuration of job runners directly
+in its main ``universe_wsgi.ini`` file. The following section describes how to
+configure Galaxy to communicate with the LWR in this legacy mode.
+
+Legacy
+------
+
+A Galaxy tool can be configured to be executed remotely via LWR by
+adding a line to the ``universe_wsgi.ini`` file under the
+``galaxy:tool_runners`` section with the format::
+
+ <tool_id> = lwr://http://<lwr_host>:<lwr_port>
+
+As an example, if a host named remotehost is running the LWR server
+application on port ``8913``, then the tool with id ``test_tool`` can
+be configured to run remotely on remotehost by adding the following
+line to ``universe_wsgi.ini``::
+
+ test_tool = lwr://http://remotehost:8913
+
+Remember this must be added after the ``[galaxy:tool_runners]`` header
+in the ``universe_wsgi.ini`` file.
+
+
"""
from .staging.down import finish_job
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/action_mapper.py
--- a/lib/galaxy/jobs/runners/lwr_client/action_mapper.py
+++ b/lib/galaxy/jobs/runners/lwr_client/action_mapper.py
@@ -166,8 +166,8 @@
action_type = mapper.action_type
file_lister = mapper.file_lister
if type in ["workdir", "output_workdir"] and action_type == "none":
- ## We are changing the working_directory relative to what
- ## Galaxy would use, these need to be copied over.
+ # We are changing the working_directory relative to what
+ # Galaxy would use, these need to be copied over.
action_type = "copy"
action_class = actions.get(action_type, None)
if action_class is None:
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/amqp_exchange.py
--- a/lib/galaxy/jobs/runners/lwr_client/amqp_exchange.py
+++ b/lib/galaxy/jobs/runners/lwr_client/amqp_exchange.py
@@ -13,8 +13,8 @@
DEFAULT_EXCHANGE_NAME = "lwr"
DEFAULT_EXCHANGE_TYPE = "direct"
-DEFAULT_TIMEOUT = 0.2 # Set timeout to periodically give up looking and check
- # if polling should end.
+# Set timeout to periodically give up looking and check if polling should end.
+DEFAULT_TIMEOUT = 0.2
class LwrExchange(object):
@@ -48,7 +48,6 @@
queue = self.__queue(queue_name)
with self.connection(self.__url, **connection_kwargs) as connection:
with kombu.Consumer(connection, queues=[queue], callbacks=[callback], accept=['json']):
- log.debug("Consuming queue %s" % queue)
while check:
try:
connection.drain_events(timeout=self.__timeout)
@@ -59,7 +58,6 @@
with self.connection(self.__url) as connection:
with pools.producers[connection].acquire() as producer:
key = self.__queue_name(name)
- log.debug("Publishing with key %s and payload %s" % (key, payload))
producer.publish(
payload,
serializer='json',
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/client.py
--- a/lib/galaxy/jobs/runners/lwr_client/client.py
+++ b/lib/galaxy/jobs/runners/lwr_client/client.py
@@ -39,9 +39,8 @@
)
else:
job_directory = None
- self.env = destination_params.get( "env", [] )
+ self.env = destination_params.get("env", [])
self.files_endpoint = destination_params.get("files_endpoint", None)
- self.env = destination_params.get("env", [])
self.job_directory = job_directory
self.default_file_action = self.destination_params.get("default_file_action", "transfer")
@@ -83,7 +82,7 @@
super(JobClient, self).__init__(destination_params, job_id)
self.job_manager_interface = job_manager_interface
- def launch(self, command_line, requirements=[], env=[], remote_staging=[], job_config=None):
+ def launch(self, command_line, dependencies_description=None, env=[], remote_staging=[], job_config=None):
"""
Queue up the execution of the supplied `command_line` on the remote
server. Called launch for historical reasons, should be renamed to
@@ -98,8 +97,8 @@
submit_params_dict = submit_params(self.destination_params)
if submit_params_dict:
launch_params['params'] = dumps(submit_params_dict)
- if requirements:
- launch_params['requirements'] = dumps([requirement.to_dict() for requirement in requirements])
+ if dependencies_description:
+ launch_params['dependencies_description'] = dumps(dependencies_description.to_dict())
if env:
launch_params['env'] = dumps(env)
if remote_staging:
@@ -248,8 +247,8 @@
return self._raw_execute(self._upload_file_action(args), args, contents, input_path)
def _upload_file_action(self, args):
- ## Hack for backward compatibility, instead of using new upload_file
- ## path. Use old paths.
+ # Hack for backward compatibility, instead of using new upload_file
+ # path. Use old paths.
input_type = args['input_type']
action = {
# For backward compatibility just target upload_input_extra for all
@@ -295,15 +294,15 @@
raise Exception(error_message)
self.client_manager = client_manager
- def launch(self, command_line, requirements=[], env=[], remote_staging=[], job_config=None):
+ def launch(self, command_line, dependencies_description=None, env=[], remote_staging=[], job_config=None):
"""
"""
launch_params = dict(command_line=command_line, job_id=self.job_id)
submit_params_dict = submit_params(self.destination_params)
if submit_params_dict:
launch_params['params'] = submit_params_dict
- if requirements:
- launch_params['requirements'] = [requirement.to_dict() for requirement in requirements]
+ if dependencies_description:
+ launch_params['dependencies_description'] = dependencies_description.to_dict()
if env:
launch_params['env'] = env
if remote_staging:
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/destination.py
--- a/lib/galaxy/jobs/runners/lwr_client/destination.py
+++ b/lib/galaxy/jobs/runners/lwr_client/destination.py
@@ -31,9 +31,9 @@
if not url.endswith("/"):
url += "/"
- ## Check for private token embedded in the URL. A URL of the form
- ## https://moo@cow:8913 will try to contact https://cow:8913
- ## with a private key of moo
+ # Check for private token embedded in the URL. A URL of the form
+ # https://moo@cow:8913 will try to contact https://cow:8913
+ # with a private key of moo
private_token_format = "https?://(.*)@.*/?"
private_token_match = match(private_token_format, url)
private_token = None
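The embedded-token convention described in the comment above can be exercised on its own. A minimal sketch using the same regular expression from the diff (the URL is a made-up example):

```python
import re

# Same pattern the LWR client uses to pull a private token out of a URL
# of the form https://<token>@<host>:<port>.
private_token_format = "https?://(.*)@.*/?"

url = "https://moo@cow:8913"
private_token_match = re.match(private_token_format, url)
private_token = private_token_match.group(1) if private_token_match else None
# private_token is now "moo"; the client contacts the URL with the
# token portion stripped.
```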
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/interface.py
--- a/lib/galaxy/jobs/runners/lwr_client/interface.py
+++ b/lib/galaxy/jobs/runners/lwr_client/interface.py
@@ -67,8 +67,8 @@
self.object_store = object_store
def __app_args(self):
- ## Arguments that would be specified from LwrApp if running
- ## in web server.
+ # Arguments that would be specified from LwrApp if running
+ # in web server.
return {
'manager': self.job_manager,
'file_cache': self.file_cache,
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/staging/__init__.py
--- a/lib/galaxy/jobs/runners/lwr_client/staging/__init__.py
+++ b/lib/galaxy/jobs/runners/lwr_client/staging/__init__.py
@@ -30,8 +30,9 @@
be transferred to remote server).
working_directory : str
Local path created by Galaxy for running this job.
- requirements : list
- List of requirements for tool execution.
+    dependencies_description : DependenciesDescription
+        galaxy.tools.deps.dependencies.DependenciesDescription object describing
+        tool dependency context for remote dependency resolution.
env: list
List of dict object describing environment variables to populate.
version_file : str
@@ -55,7 +56,7 @@
input_files,
client_outputs,
working_directory,
- requirements,
+ dependencies_description=None,
env=[],
arbitrary_files=None,
rewrite_paths=True,
@@ -66,7 +67,7 @@
self.input_files = input_files
self.client_outputs = client_outputs
self.working_directory = working_directory
- self.requirements = requirements
+ self.dependencies_description = dependencies_description
self.env = env
self.rewrite_paths = rewrite_paths
self.arbitrary_files = arbitrary_files or {}
@@ -79,6 +80,15 @@
def version_file(self):
return self.client_outputs.version_file
+ @property
+ def tool_dependencies(self):
+ if not self.remote_dependency_resolution:
+ return None
+ return dict(
+ requirements=(self.tool.requirements or []),
+ installed_tool_dependencies=(self.tool.installed_tool_dependencies or [])
+ )
+
class ClientOutputs(object):
""" Abstraction describing the output datasets EXPECTED by the Galaxy job
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/staging/up.py
--- a/lib/galaxy/jobs/runners/lwr_client/staging/up.py
+++ b/lib/galaxy/jobs/runners/lwr_client/staging/up.py
@@ -24,7 +24,7 @@
job_id = file_stager.job_id
launch_kwds = dict(
command_line=rebuilt_command_line,
- requirements=client_job_description.requirements,
+ dependencies_description=client_job_description.dependencies_description,
env=client_job_description.env,
)
if file_stager.job_config:
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/jobs/runners/lwr_client/transport/__init__.py
--- a/lib/galaxy/jobs/runners/lwr_client/transport/__init__.py
+++ b/lib/galaxy/jobs/runners/lwr_client/transport/__init__.py
@@ -15,8 +15,8 @@
def __get_transport_type(transport_type, os_module):
if not transport_type:
use_curl = os_module.getenv('LWR_CURL_TRANSPORT', "0")
- ## If LWR_CURL_TRANSPORT is unset or set to 0, use default,
- ## else use curl.
+ # If LWR_CURL_TRANSPORT is unset or set to 0, use default,
+ # else use curl.
if use_curl.isdigit() and not int(use_curl):
transport_type = 'urllib'
else:
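The environment-variable check in this hunk is simple to illustrate standalone - a simplified sketch of the selection logic (not the module's exact code):

```python
import os


def get_transport_type():
    # If LWR_CURL_TRANSPORT is unset or set to "0", use the default
    # urllib transport; any other value selects curl.
    use_curl = os.getenv("LWR_CURL_TRANSPORT", "0")
    if use_curl.isdigit() and not int(use_curl):
        return "urllib"
    return "curl"
```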
diff -r 48ee04030f893a9a9657b639bb38819fb4404388 -r 24b5759f33f87879e945a2f14ef5e8abfce488c1 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2673,12 +2673,18 @@
def build_dependency_shell_commands( self ):
"""Return a list of commands to be run to populate the current environment to include this tools requirements."""
+ return self.app.toolbox.dependency_manager.dependency_shell_commands(
+ self.requirements,
+ installed_tool_dependencies=self.installed_tool_dependencies
+ )
+
+ @property
+ def installed_tool_dependencies(self):
if self.tool_shed_repository:
installed_tool_dependencies = self.tool_shed_repository.tool_dependencies_installed_or_in_error
else:
installed_tool_dependencies = None
- return self.app.toolbox.dependency_manager.dependency_shell_commands( self.requirements,
- installed_tool_dependencies=installed_tool_dependencies )
+ return installed_tool_dependencies
def build_redirect_url_params( self, param_dict ):
"""
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/246b419fc92a/
Changeset: 246b419fc92a
User: dan
Date: 2014-05-12 21:27:36
Summary: When working with <conversions> in DataToolParameters, have the parsed conv_types be a list, as is now expected in subsequent usage.
Affected #: 1 file
diff -r 2b7e97a34afa6b33e9950df9213b3cb9be9026a3 -r 246b419fc92a91f31eb7cdd5e2de4a4c15a4b631 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1659,7 +1659,7 @@
conv_extensions = conv_elem.get( "type" ) # target datatype extension
# FIXME: conv_extensions should be able to be an ordered list
assert None not in [ name, type ], 'A name (%s) and type (%s) are required for explicit conversion' % ( name, type )
- conv_types = tool.app.datatypes_registry.get_datatype_by_extension( conv_extensions.lower() )
+ conv_types = [ tool.app.datatypes_registry.get_datatype_by_extension( conv_extensions.lower() ) ]
self.conversions.append( ( name, conv_extensions, conv_types ) )
def get_html_field( self, trans=None, value=None, other_values={} ):
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/2b7e97a34afa/
Changeset: 2b7e97a34afa
User: martenson
Date: 2014-05-12 20:56:06
Summary: important docs for select2 backbone view
Affected #: 2 files
diff -r 8e2135ee278aa44eb5a8da04a392bfa44efc0279 -r 2b7e97a34afa6b33e9950df9213b3cb9be9026a3 static/scripts/mvc/ui/ui-select.js
--- a/static/scripts/mvc/ui/ui-select.js
+++ b/static/scripts/mvc/ui/ui-select.js
@@ -1,7 +1,12 @@
// dependencies
define(['utils/utils'], function(Utils) {
-// plugin
+/**
+ * A plugin for initializing select2 input items.
+ * Make sure the select2 library itself is loaded beforehand.
+ * Also the element to which select2 will be appended has to
+ * be created before select2 initialization (and passed as option).
+ */
var View = Backbone.View.extend(
{
// options
@@ -181,13 +186,13 @@
},
// element
- _template: function(options) {
+ _template: function() {
return '<input type="hidden"/>';
}
});
return {
View : View
-}
+};
});
diff -r 8e2135ee278aa44eb5a8da04a392bfa44efc0279 -r 2b7e97a34afa6b33e9950df9213b3cb9be9026a3 static/scripts/packed/mvc/ui/ui-select.js
--- a/static/scripts/packed/mvc/ui/ui-select.js
+++ b/static/scripts/packed/mvc/ui/ui-select.js
@@ -1,1 +1,1 @@
-define(["utils/utils"],function(a){var b=Backbone.View.extend({optionsDefault:{css:"",placeholder:"No data available",data:[],value:null},initialize:function(d){this.options=a.merge(d,this.optionsDefault);this.setElement(this._template(this.options));if(!this.options.container){console.log("ui-select::initialize() : container not specified.");return}this.options.container.append(this.$el);this.select_data=this.options.data;this._refresh();if(this.options.value){this._setValue(this.options.value)}var c=this;if(this.options.onchange){this.$el.on("change",function(){c.options.onchange(c.value())})}},value:function(c){var d=this._getValue();if(c!==undefined){this._setValue(c)}var e=this._getValue();if((e!=d&&this.options.onchange)){this.options.onchange(e)}return e},text:function(){return this.$el.select2("data").text},disabled:function(){return !this.$el.select2("enable")},enable:function(){this.$el.select2("enable",true)},disable:function(){this.$el.select2("enable",false)},add:function(c){this.select_data.push({id:c.id,text:c.text});this._refresh()},del:function(d){var c=this._getIndex(d);if(c!=-1){this.select_data.splice(c,1);this._refresh()}},remove:function(){this.$el.select2("destroy")},update:function(c){this.select_data=[];for(var d in c.data){this.select_data.push(c.data[d])}this._refresh()},_refresh:function(){var c=this._getValue();this.$el.select2({data:this.select_data,containerCssClass:this.options.css,placeholder:this.options.placeholder});this._setValue(c)},_getIndex:function(d){for(var c in this.select_data){if(this.select_data[c].id==d){return c}}return -1},_getValue:function(){return this.$el.select2("val")},_setValue:function(d){var c=this._getIndex(d);if(c==-1){if(this.select_data.length>0){d=this.select_data[0].id}}this.$el.select2("val",d)},_template:function(c){return'<input type="hidden"/>'}});return{View:b}});
\ No newline at end of file
+define(["utils/utils"],function(a){var b=Backbone.View.extend({optionsDefault:{css:"",placeholder:"No data available",data:[],value:null},initialize:function(d){this.options=a.merge(d,this.optionsDefault);this.setElement(this._template(this.options));if(!this.options.container){console.log("ui-select::initialize() : container not specified.");return}this.options.container.append(this.$el);this.select_data=this.options.data;this._refresh();if(this.options.value){this._setValue(this.options.value)}var c=this;if(this.options.onchange){this.$el.on("change",function(){c.options.onchange(c.value())})}},value:function(c){var d=this._getValue();if(c!==undefined){this._setValue(c)}var e=this._getValue();if((e!=d&&this.options.onchange)){this.options.onchange(e)}return e},text:function(){return this.$el.select2("data").text},disabled:function(){return !this.$el.select2("enable")},enable:function(){this.$el.select2("enable",true)},disable:function(){this.$el.select2("enable",false)},add:function(c){this.select_data.push({id:c.id,text:c.text});this._refresh()},del:function(d){var c=this._getIndex(d);if(c!=-1){this.select_data.splice(c,1);this._refresh()}},remove:function(){this.$el.select2("destroy")},update:function(c){this.select_data=[];for(var d in c.data){this.select_data.push(c.data[d])}this._refresh()},_refresh:function(){var c=this._getValue();this.$el.select2({data:this.select_data,containerCssClass:this.options.css,placeholder:this.options.placeholder});this._setValue(c)},_getIndex:function(d){for(var c in this.select_data){if(this.select_data[c].id==d){return c}}return -1},_getValue:function(){return this.$el.select2("val")},_setValue:function(d){var c=this._getIndex(d);if(c==-1){if(this.select_data.length>0){d=this.select_data[0].id}}this.$el.select2("val",d)},_template:function(){return'<input type="hidden"/>'}});return{View:b}});
\ No newline at end of file
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/8e2135ee278a/
Changeset: 8e2135ee278a
User: greg
Date: 2014-05-12 18:47:57
Summary: Fix typo in the Tool Shed's package recipe install step handler - thanks Björn Grüning.
Affected #: 1 file
diff -r 7950ab309ad1cb06cf3af973adcca1e3209c9241 -r 8e2135ee278aa44eb5a8da04a392bfa44efc0279 lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/recipe/step_handler.py
@@ -1337,7 +1337,7 @@
"""
env_vars = dict()
env_vars = install_environment.environment_dict()
- tool_shed_repository = tool_depenedncy.tool_shed_repository
+ tool_shed_repository = tool_dependency.tool_shed_repository
env_vars.update( td_common_util.get_env_var_values( install_environment ) )
language = action_dict[ 'language' ]
with settings( warn_only=True, **env_vars ):
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/a74fd6395843/
Changeset: a74fd6395843
User: Jeremy Goecks
Date: 2014-05-09 22:20:33
Summary: Add example configuration for CLI Slurm runner.
Affected #: 1 file
diff -r ae7da2ebe7737d6e6ba142d1ac24a24b467285ad -r a74fd6395843cb384928ad367fd6f5f6d5eff65c job_conf.xml.sample_advanced
--- a/job_conf.xml.sample_advanced
+++ b/job_conf.xml.sample_advanced
@@ -164,6 +164,18 @@
<param id="shell_hostname">foo.example.org</param><param id="job_Resource_List">walltime=24:00:00,ncpus=4</param></destination>
+
+ <!-- Example CLI Slurm runner. -->
+ <destination id="ssh_slurm" runner="cli">
+ <param id="shell_plugin">SecureShell</param>
+ <param id="job_plugin">Slurm</param>
+ <param id="shell_username">foo</param>
+ <param id="shell_hostname">my_host</param>
+ <param id="job_time">2:00:00</param>
+ <param id="job_ncpus">4</param>
+ <param id="job_partition">my_partition</param>
+ </destination>
+
<destination id="condor" runner="condor"><!-- With no params, jobs are submitted to the 'vanilla' universe with:
notification = NEVER
Repository URL: https://bitbucket.org/galaxy/galaxy-central/