galaxy-commits
December 2012
- 1 participant
- 142 discussions
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/213219e966b4/
changeset: 213219e966b4
user: greg
date: 2013-01-01 00:39:02
summary: Tool shed utility import fixes.
affected #: 2 files
Diffs (against parent 35e271f43ff1): lib/galaxy/util/shed_util_common.py, lib/galaxy/webapps/community/controllers/common.py
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/35e271f43ff1/
changeset: 35e271f43ff1
user: greg
date: 2013-01-01 00:34:42
summary: Tool shed utility refactoring fix.
affected #: 2 files
Diffs (against parent 7117f892dea8): lib/galaxy/util/shed_util_common.py, lib/galaxy/webapps/community/controllers/common.py
commit/galaxy-central: greg: Fix status color when monitoring tool shed repositories being installed that end in a missing repository dependencies status.
by Bitbucket 31 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/7117f892dea8/
changeset: 7117f892dea8
user: greg
date: 2012-12-31 17:59:36
summary: Fix status color when monitoring tool shed repositories being installed that end in a missing repository dependencies status.
affected #: 2 files
Diffs (against parent 81d16941cb2c): lib/galaxy/util/shed_util.py, templates/admin/tool_shed_repository/repository_installation_status.mako
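The fix above adjusts how a terminal "missing repository dependencies" state is colored in the installation-monitoring template. A minimal sketch of this kind of status-to-color mapping, in which the status strings and colors are illustrative assumptions rather than Galaxy's exact values:

```python
# Hypothetical mapping from an installation status string to a display
# color. The point of the fix above: a terminal state that still signals
# a problem (missing repository dependencies) should render as an error
# color rather than the "in progress" color.
STATUS_COLORS = {
    'New': 'gray',
    'Cloning': 'yellow',
    'Installing repository dependencies': 'yellow',
    'Installed': 'green',
    'Installed, missing repository dependencies': 'red',
    'Error': 'red',
}

def status_color(status):
    """Return the display color for a status, defaulting to gray."""
    return STATUS_COLORS.get(status, 'gray')
```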
commit/galaxy-central: greg: Fixes for reinstalling an uninstalled tool shed repository but choosing to not handle repository dependencies.
by Bitbucket 30 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/81d16941cb2c/
changeset: 81d16941cb2c
user: greg
date: 2012-12-30 18:00:14
summary: Fixes for reinstalling an uninstalled tool shed repository but choosing to not handle repository dependencies.
affected #: 1 file
Diffs (against parent f9f294f7c21e): lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
commit/galaxy-central: greg: Fixes for reinstalling repository dependencies for tool shed repositories that are being reinstalled.
by Bitbucket 29 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f9f294f7c21e/
changeset: f9f294f7c21e
user: greg
date: 2012-12-29 23:17:19
summary: Fixes for reinstalling repository dependencies for tool shed repositories that are being reinstalled.
affected #: 6 files
Diffs (against parent f329896aeb85): lib/galaxy/model/__init__.py, lib/galaxy/util/shed_util.py, lib/galaxy/util/shed_util_common.py, lib/galaxy/webapps/community/controllers/common.py, lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py, templates/admin/tool_shed_repository/select_tool_panel_section.mako
commit/galaxy-central: greg: Add exception handling when checking for updates for an installed tool shed repository.
by Bitbucket 28 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f329896aeb85/
changeset: f329896aeb85
user: greg
date: 2012-12-28 22:15:59
summary: Add exception handling when checking for updates for an installed tool shed repository.
affected #: 1 file
Diffs (against parent 7aa9121cdbe1): lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
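The commit above wraps the tool shed update check in exception handling so that an unreachable shed no longer breaks the admin page. A minimal sketch of that defensive pattern; the helper name, endpoint path, and parameters here are illustrative, not Galaxy's actual API:

```python
import urllib.request
from urllib.error import URLError

def check_for_updates(tool_shed_url, name, owner, changeset_revision, timeout=10):
    """Ask a (hypothetical) tool shed endpoint for the latest changeset revision.

    Returns the revision string, or None if the shed is unreachable or the
    request fails in any way -- degrading gracefully instead of raising,
    which is the goal of the commit above.
    """
    url = "%s/repository/get_changeset_revision_and_ctx_rev?name=%s&owner=%s&changeset_revision=%s" % (
        tool_shed_url.rstrip("/"), name, owner, changeset_revision)
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read().decode("utf-8").strip() or None
    except (URLError, OSError, ValueError):
        # Any connectivity or URL problem is treated as "no update info".
        return None
```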
commit/galaxy-central: greg: Fixes and improvements for importing a workflow contained in an installed tool shed repository into the Galaxy instance.
by Bitbucket 28 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/7aa9121cdbe1/
changeset: 7aa9121cdbe1
user: greg
date: 2012-12-28 21:52:15
summary: Fixes and improvements for importing a workflow contained in an installed tool shed repository into the Galaxy instance.
affected #: 3 files
Diffs (against parent c92f88b21a16): lib/galaxy/webapps/community/util/workflow_util.py, lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py, templates/admin/tool_shed_repository/view_workflow.mako
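Workflows exported from Galaxy are JSON documents, which is what the import fixes above operate on. A minimal sketch of reading such a document and listing its steps; the keys used ('name', 'steps', 'tool_id') follow the general shape of exported workflow JSON but are a simplified assumption, not the full schema:

```python
import json

def summarize_workflow(workflow_json):
    """Return (workflow_name, ordered step names) from an exported workflow.

    Steps are keyed by their order index in the exported dict; tool steps
    carry a 'tool_id', while input steps fall back to their 'name'.
    """
    workflow = json.loads(workflow_json)
    steps = workflow.get('steps', {})
    step_names = [step.get('tool_id') or step.get('name', 'input')
                  for _, step in sorted(steps.items())]
    return workflow.get('name', 'Unnamed workflow'), step_names
```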
commit/galaxy-central: dannon: Admin: Select all checkbox added for job management. See card: https://trello.com/card/select-all-option-for-admin-jobs/50686d0302dfa79d13d90c45/338
by Bitbucket 28 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/c92f88b21a16/
changeset: c92f88b21a16
user: dannon
date: 2012-12-28 17:21:49
summary: Admin: Select all checkbox added for job management. See card: https://trello.com/card/select-all-option-for-admin-jobs/50686d0302dfa79d13…
affected #: 1 file
Diffs (against parent 3b46eb7042c6): templates/admin/jobs.mako
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/3b46eb7042c6/
changeset: 3b46eb7042c6
user: dannon
date: 2012-12-28 16:51:03
summary: Admin UI: Make left panel scroll.
affected #: 1 file
Diffs (against parent 7073d786cad0): templates/webapps/galaxy/admin/index.mako
27 Dec '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/7073d786cad0/
changeset: 7073d786cad0
user: greg
date: 2012-12-27 22:00:16
summary: Add the ability to view an SVG image of a workflow contained in a tool shed repository installed into a Galaxy instance. This feature is the same as the feature that has been available in the tool shed for some time.
affected #: 14 files
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -1,14 +1,14 @@
import os, shutil, tempfile, logging, string, threading, urllib2
from galaxy import util
from galaxy.tools import parameters
-from galaxy.util import inflector
-from galaxy.util import json
+from galaxy.util import inflector, json
from galaxy.web import url_for
from galaxy.web.form_builder import SelectField
from galaxy.webapps.community.util import container_util
from galaxy.datatypes import checkers
from galaxy.model.orm import and_
from galaxy.tools.parameters import dynamic_options
+from galaxy.tool_shed import encoding_util
from galaxy import eggs
import pkg_resources
@@ -176,7 +176,12 @@
containers_dict[ 'valid_tools' ] = valid_tools_root_folder
# Workflows container.
if workflows:
- folder_id, workflows_root_folder = container_util.build_workflows_folder( trans, folder_id, workflows, repository_metadata, label='Workflows' )
+ folder_id, workflows_root_folder = container_util.build_workflows_folder( trans=trans,
+ folder_id=folder_id,
+ workflows=workflows,
+ repository_metadata_id=None,
+ repository_id=repository_id,
+ label='Workflows' )
containers_dict[ 'workflows' ] = workflows_root_folder
except Exception, e:
log.debug( "Exception in build_repository_containers_for_galaxy: %s" % str( e ) )
@@ -252,7 +257,12 @@
# Workflows container.
if metadata and 'workflows' in metadata:
workflows = metadata[ 'workflows' ]
- folder_id, workflows_root_folder = container_util.build_workflows_folder( trans, folder_id, workflows, repository_metadata, label='Workflows' )
+ folder_id, workflows_root_folder = container_util.build_workflows_folder( trans=trans,
+ folder_id=folder_id,
+ workflows=workflows,
+ repository_metadata_id=repository_metadata.id,
+ repository_id=None,
+ label='Workflows' )
containers_dict[ 'workflows' ] = workflows_root_folder
except Exception, e:
log.debug( "Exception in build_repository_containers_for_tool_shed: %s" % str( e ) )
@@ -400,6 +410,25 @@
# tag for any tool in the repository.
break
return can_generate_dependency_metadata
+def can_use_tool_config_disk_file( trans, repository, repo, file_path, changeset_revision ):
+ """
+ Determine if repository's tool config file on disk can be used. This method is restricted to tool config files since, with the
+ exception of tool config files, multiple files with the same name will likely be in various directories in the repository and we're
+ comparing file names only (not relative paths).
+ """
+ if not file_path or not os.path.exists( file_path ):
+ # The file no longer exists on disk, so it must have been deleted at some previous point in the change log.
+ return False
+ if changeset_revision == repository.tip( trans.app ):
+ return True
+ file_name = strip_path( file_path )
+ latest_version_of_file = get_latest_tool_config_revision_from_repository_manifest( repo, file_name, changeset_revision )
+ can_use_disk_file = filecmp.cmp( file_path, latest_version_of_file )
+ try:
+ os.unlink( latest_version_of_file )
+ except:
+ pass
+ return can_use_disk_file
def check_tool_input_params( app, repo_dir, tool_config_name, tool, sample_files ):
"""
Check all of the tool's input parameters, looking for any that are dynamically generated using external data files to make
@@ -1339,6 +1368,16 @@
else:
metadata_dict[ 'workflows' ] = [ ( relative_path, exported_workflow_dict ) ]
return metadata_dict
+def get_absolute_path_to_file_in_repository( repo_files_dir, file_name ):
+ """Return the absolute path to a specified disk file containe in a repository."""
+ stripped_file_name = strip_path( file_name )
+ file_path = None
+ for root, dirs, files in os.walk( repo_files_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name == stripped_file_name:
+ return os.path.abspath( os.path.join( root, name ) )
+ return file_path
def get_changectx_for_changeset( repo, changeset_revision, **kwd ):
"""Retrieve a specified changectx from a repository"""
for changeset in repo.changelog:
@@ -1487,14 +1526,14 @@
repository_clone_url = os.path.join( tool_shed_url, 'repos', owner, name )
ctx_rev = get_ctx_rev( tool_shed_url, name, owner, installed_changeset_revision )
print "Adding new row (or updating an existing row) for repository '%s' in the tool_shed_repository table." % name
- repository = create_or_update_tool_shed_repository( app=self.app,
+ repository = create_or_update_tool_shed_repository( app=trans.app,
name=name,
description=None,
installed_changeset_revision=changeset_revision,
ctx_rev=ctx_rev,
repository_clone_url=repository_clone_url,
metadata_dict={},
- status=self.app.model.ToolShedRepository.installation_status.NEW,
+ status=trans.model.ToolShedRepository.installation_status.NEW,
current_changeset_revision=None,
owner=sowner,
dist_to_shed=False )
@@ -1712,6 +1751,9 @@
elif all_metadata_records:
return all_metadata_records[ 0 ]
return None
+def get_repository_metadata_by_id( trans, id ):
+ """Get repository metadata from the database"""
+ return trans.sa_session.query( trans.model.RepositoryMetadata ).get( trans.security.decode_id( id ) )
def get_repository_metadata_by_repository_id_changset_revision( trans, id, changeset_revision ):
"""Get a specified metadata record for a specified repository."""
return trans.sa_session.query( trans.model.RepositoryMetadata ) \
@@ -1822,6 +1864,10 @@
tool_path = shed_tool_conf_dict[ 'tool_path' ]
break
return tool_path
+def get_tool_shed_repository_by_id( trans, repository_id ):
+ return trans.sa_session.query( trans.model.ToolShedRepository ) \
+ .filter( trans.model.ToolShedRepository.table.c.id == trans.security.decode_id( repository_id ) ) \
+ .first()
def get_tool_shed_repository_by_shed_name_owner_changeset_revision( app, tool_shed, name, owner, changeset_revision ):
# This method is used only in Galaxy, not the tool shed.
sa_session = app.model.context.current
@@ -2110,6 +2156,46 @@
description = repository_dependencies_dict.get( 'description', None )
all_repository_dependencies[ 'description' ] = description
return all_repository_dependencies
+def load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config_filename ):
+ """
+ Return a loaded tool whose tool config file name (e.g., filtering.xml) is the value of tool_config_filename. The value of changeset_revision
+ is a valid (downloadable) changset revision. The tool config will be located in the repository manifest between the received valid changeset
+ revision and the first changeset revision in the repository, searching backwards.
+ """
+ original_tool_data_path = trans.app.config.tool_data_path
+ repository = get_repository_in_tool_shed( trans, repository_id )
+ repo_files_dir = repository.repo_path( trans.app )
+ repo = hg.repository( get_configured_ui(), repo_files_dir )
+ message = ''
+ tool = None
+ can_use_disk_file = False
+ tool_config_filepath = get_absolute_path_to_file_in_repository( repo_files_dir, tool_config_filename )
+ work_dir = tempfile.mkdtemp()
+ can_use_disk_file = can_use_tool_config_disk_file( trans, repository, repo, tool_config_filepath, changeset_revision )
+ if can_use_disk_file:
+ trans.app.config.tool_data_path = work_dir
+ tool, valid, message, sample_files = handle_sample_files_and_load_tool_from_disk( trans, repo_files_dir, tool_config_filepath, work_dir )
+ if tool is not None:
+ invalid_files_and_errors_tups = check_tool_input_params( trans.app,
+ repo_files_dir,
+ tool_config_filename,
+ tool,
+ sample_files )
+ if invalid_files_and_errors_tups:
+ message2 = generate_message_for_invalid_tools( trans,
+ invalid_files_and_errors_tups,
+ repository,
+ metadata_dict=None,
+ as_html=True,
+ displaying_invalid_tool=True )
+ message = concat_messages( message, message2 )
+ else:
+ tool, message, sample_files = handle_sample_files_and_load_tool_from_tmp_config( trans, repo, changeset_revision, tool_config_filename, work_dir )
+ remove_dir( work_dir )
+ trans.app.config.tool_data_path = original_tool_data_path
+ # Reset the tool_data_tables by loading the empty tool_data_table_conf.xml file.
+ reset_tool_data_tables( trans.app )
+ return repository, tool, message
def load_tool_from_config( app, full_path ):
try:
tool = app.toolbox.load_tool( full_path )
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py
+++ b/lib/galaxy/webapps/community/controllers/admin.py
@@ -533,7 +533,7 @@
# The received id is a RepositoryMetadata object id, so we need to get the
# associated Repository and redirect to view_or_manage_repository with the
# changeset_revision.
- repository_metadata = common.get_repository_metadata_by_id( trans, kwd[ 'id' ] )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, kwd[ 'id' ] )
repository = repository_metadata.repository
kwd[ 'id' ] = trans.security.encode_id( repository.id )
kwd[ 'changeset_revision' ] = repository_metadata.changeset_revision
@@ -615,7 +615,7 @@
ids = util.listify( id )
count = 0
for repository_metadata_id in ids:
- repository_metadata = common.get_repository_metadata_by_id( trans, repository_metadata_id )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, repository_metadata_id )
trans.sa_session.delete( repository_metadata )
trans.sa_session.flush()
count += 1
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -113,25 +113,6 @@
repository_metadata.tool_versions = tool_versions_dict
trans.sa_session.add( repository_metadata )
trans.sa_session.flush()
-def can_use_tool_config_disk_file( trans, repository, repo, file_path, changeset_revision ):
- """
- Determine if repository's tool config file on disk can be used. This method is restricted to tool config files since, with the
- exception of tool config files, multiple files with the same name will likely be in various directories in the repository and we're
- comparing file names only (not relative paths).
- """
- if not file_path or not os.path.exists( file_path ):
- # The file no longer exists on disk, so it must have been deleted at some previous point in the change log.
- return False
- if changeset_revision == repository.tip( trans.app ):
- return True
- file_name = suc.strip_path( file_path )
- latest_version_of_file = get_latest_tool_config_revision_from_repository_manifest( repo, file_name, changeset_revision )
- can_use_disk_file = filecmp.cmp( file_path, latest_version_of_file )
- try:
- os.unlink( latest_version_of_file )
- except:
- pass
- return can_use_disk_file
def changeset_is_malicious( trans, id, changeset_revision, **kwd ):
"""Check the malicious flag in repository metadata for a specified change set"""
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans, id, changeset_revision )
@@ -155,15 +136,6 @@
if user_email in admin_users:
return True
return False
-def get_absolute_path_to_file_in_repository( repo_files_dir, file_name ):
- stripped_file_name = suc.strip_path( file_name )
- file_path = None
- for root, dirs, files in os.walk( repo_files_dir ):
- if root.find( '.hg' ) < 0:
- for name in files:
- if name == stripped_file_name:
- return os.path.abspath( os.path.join( root, name ) )
- return file_path
def get_category( trans, id ):
"""Get a category from the database"""
return trans.sa_session.query( trans.model.Category ).get( trans.security.decode_id( id ) )
@@ -249,9 +221,6 @@
def get_repository_by_name( trans, name ):
"""Get a repository from the database via name"""
return trans.sa_session.query( trans.model.Repository ).filter_by( name=name ).one()
-def get_repository_metadata_by_id( trans, id ):
- """Get repository metadata from the database"""
- return trans.sa_session.query( trans.model.RepositoryMetadata ).get( trans.security.decode_id( id ) )
def get_repository_metadata_revisions_for_review( repository, reviewed=True ):
repository_metadata_revisions = []
metadata_changeset_revision_hashes = []
@@ -425,46 +394,6 @@
if previous_changeset_revision in reviewed_revision_hashes:
return True
return False
-def load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config_filename ):
- """
- Return a loaded tool whose tool config file name (e.g., filtering.xml) is the value of tool_config_filename. The value of changeset_revision
- is a valid (downloadable) changset revision. The tool config will be located in the repository manifest between the received valid changeset
- revision and the first changeset revision in the repository, searching backwards.
- """
- original_tool_data_path = trans.app.config.tool_data_path
- repository = suc.get_repository_in_tool_shed( trans, repository_id )
- repo_files_dir = repository.repo_path( trans.app )
- repo = hg.repository( suc.get_configured_ui(), repo_files_dir )
- message = ''
- tool = None
- can_use_disk_file = False
- tool_config_filepath = get_absolute_path_to_file_in_repository( repo_files_dir, tool_config_filename )
- work_dir = tempfile.mkdtemp()
- can_use_disk_file = can_use_tool_config_disk_file( trans, repository, repo, tool_config_filepath, changeset_revision )
- if can_use_disk_file:
- trans.app.config.tool_data_path = work_dir
- tool, valid, message, sample_files = suc.handle_sample_files_and_load_tool_from_disk( trans, repo_files_dir, tool_config_filepath, work_dir )
- if tool is not None:
- invalid_files_and_errors_tups = suc.check_tool_input_params( trans.app,
- repo_files_dir,
- tool_config_filename,
- tool,
- sample_files )
- if invalid_files_and_errors_tups:
- message2 = suc.generate_message_for_invalid_tools( trans,
- invalid_files_and_errors_tups,
- repository,
- metadata_dict=None,
- as_html=True,
- displaying_invalid_tool=True )
- message = suc.concat_messages( message, message2 )
- else:
- tool, message, sample_files = suc.handle_sample_files_and_load_tool_from_tmp_config( trans, repo, changeset_revision, tool_config_filename, work_dir )
- suc.remove_dir( work_dir )
- trans.app.config.tool_data_path = original_tool_data_path
- # Reset the tool_data_tables by loading the empty tool_data_table_conf.xml file.
- suc.reset_tool_data_tables( trans.app )
- return repository, tool, message
def new_repository_dependency_metadata_required( trans, repository, metadata_dict ):
"""
Compare the last saved metadata for each repository dependency in the repository with the new
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -1,18 +1,20 @@
-import os, logging, tempfile, shutil, ConfigParser
+import os, logging, re, tempfile, shutil, ConfigParser
from time import gmtime, strftime
from datetime import date, datetime
-from galaxy import util
+from galaxy import util, web
from galaxy.util.odict import odict
-from galaxy.web.base.controller import *
-from galaxy.web.form_builder import CheckboxField
+from galaxy.web.base.controller import BaseUIController
+from galaxy.web.form_builder import CheckboxField, SelectField, build_select_field
from galaxy.webapps.community import model
from galaxy.webapps.community.model import directory_hash_id
-from galaxy.web.framework.helpers import time_ago, iff, grids
-from galaxy.util.json import from_json_string, to_json_string
-from galaxy.model.orm import and_
+from galaxy.web.framework.helpers import grids
+from galaxy.util import json
+from galaxy.model.orm import and_, or_
import galaxy.util.shed_util_common as suc
from galaxy.tool_shed import encoding_util
+from galaxy.webapps.community.util import workflow_util
import common
+import galaxy.tools
from galaxy import eggs
eggs.require('mercurial')
@@ -157,7 +159,7 @@
model.User.table.c.email == column_filter ) )
class EmailAlertsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository ):
- if trans.user and repository.email_alerts and trans.user.email in from_json_string( repository.email_alerts ):
+ if trans.user and repository.email_alerts and trans.user.email in json.from_json_string( repository.email_alerts ):
return 'yes'
return ''
class DeprecatedColumn( grids.TextColumn ):
@@ -833,7 +835,7 @@
# Start building up the url to redirect back to the calling Galaxy instance.
url = suc.url_join( galaxy_url,
'admin_toolshed/update_to_changeset_revision?tool_shed_url=%s&name=%s&owner=%s&changeset_revision=%s&latest_changeset_revision=' % \
- ( url_for( '/', qualified=True ), repository.name, repository.user.username, changeset_revision ) )
+ ( web.url_for( '/', qualified=True ), repository.name, repository.user.username, changeset_revision ) )
if changeset_revision == repository.tip( trans.app ):
# If changeset_revision is the repository tip, there are no additional updates.
if from_update_manager:
@@ -1020,7 +1022,7 @@
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
- repository, tool, message = common.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
+ repository, tool, message = suc.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
if message:
status = 'error'
tool_state = self.__new_state( trans )
@@ -1087,7 +1089,7 @@
is_admin = trans.user_is_admin()
if operation == "view_or_manage_repository":
# The received id is a RepositoryMetadata id, so we have to get the repository id.
- repository_metadata = common.get_repository_metadata_by_id( trans, item_id )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, item_id )
repository_id = trans.security.encode_id( repository_metadata.repository.id )
repository = suc.get_repository_in_tool_shed( trans, repository_id )
kwd[ 'id' ] = repository_id
@@ -1104,7 +1106,7 @@
encoded_repository_ids = []
changeset_revisions = []
for repository_metadata_id in util.listify( item_id ):
- repository_metadata = common.get_repository_metadata_by_id( trans, repository_metadata_id )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, repository_metadata_id )
encoded_repository_ids.append( trans.security.encode_id( repository_metadata.repository.id ) )
changeset_revisions.append( repository_metadata.changeset_revision )
new_kwd[ 'repository_ids' ] = encoded_repository_ids
@@ -1172,7 +1174,7 @@
is_admin = trans.user_is_admin()
if operation == "view_or_manage_repository":
# The received id is a RepositoryMetadata id, so we have to get the repository id.
- repository_metadata = common.get_repository_metadata_by_id( trans, item_id )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, item_id )
repository_id = trans.security.encode_id( repository_metadata.repository.id )
repository = suc.get_repository_in_tool_shed( trans, repository_id )
kwd[ 'id' ] = repository_id
@@ -1189,7 +1191,7 @@
encoded_repository_ids = []
changeset_revisions = []
for repository_metadata_id in util.listify( item_id ):
- repository_metadata = common.get_repository_metadata_by_id( trans, item_id )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, item_id )
encoded_repository_ids.append( trans.security.encode_id( repository_metadata.repository.id ) )
changeset_revisions.append( repository_metadata.changeset_revision )
new_kwd = {}
@@ -1245,6 +1247,10 @@
message=message,
status=status )
@web.expose
+ def generate_workflow_image( self, trans, workflow_name, repository_metadata_id=None ):
+ """Return an svg image representation of a workflow dictionary created when the workflow was exported."""
+ return workflow_util.generate_workflow_image( trans, workflow_name, repository_metadata_id=repository_metadata_id, repository_id=None )
+ @web.expose
def get_changeset_revision_and_ctx_rev( self, trans, **kwd ):
"""Handle a request from a local Galaxy instance to retrieve the changeset revision hash to which an installed repository can be updated."""
params = util.Params( kwd )
@@ -1355,7 +1361,7 @@
repository_dependencies = suc.get_repository_dependencies_for_changeset_revision( trans=trans,
repository=repository,
repository_metadata=repository_metadata,
- toolshed_base_url=str( url_for( '/', qualified=True ) ).rstrip( '/' ),
+ toolshed_base_url=str( web.url_for( '/', qualified=True ) ).rstrip( '/' ),
key_rd_dicts_to_be_processed=None,
all_repository_dependencies=None,
handled_key_rd_dicts=None,
@@ -1418,7 +1424,7 @@
encoded_repository_ids.append( trans.security.encode_id( repository.id ) )
changeset_revisions.append( changeset_revision )
if encoded_repository_ids and changeset_revisions:
- repo_info_dict = from_json_string( self.get_repository_information( trans, encoded_repository_ids, changeset_revisions ) )
+ repo_info_dict = json.from_json_string( self.get_repository_information( trans, encoded_repository_ids, changeset_revisions ) )
else:
repo_info_dict = {}
return repo_info_dict
@@ -1465,7 +1471,7 @@
if current_changeset_revision == changeset_revision:
break
if tool_version_dicts:
- return to_json_string( tool_version_dicts )
+ return json.to_json_string( tool_version_dicts )
return ''
def get_versions_of_tool( self, trans, repository, repository_metadata, guid ):
"""Return the tool lineage in descendant order for the received guid contained in the received repsitory_metadata.tool_versions."""
@@ -1595,14 +1601,14 @@
# Redirect back to local Galaxy to perform install.
url = suc.url_join( galaxy_url,
'admin_toolshed/prepare_for_install?tool_shed_url=%s&repository_ids=%s&changeset_revisions=%s' % \
- ( url_for( '/', qualified=True ), ','.join( util.listify( repository_ids ) ), ','.join( util.listify( changeset_revisions ) ) ) )
+ ( web.url_for( '/', qualified=True ), ','.join( util.listify( repository_ids ) ), ','.join( util.listify( changeset_revisions ) ) ) )
return trans.response.send_redirect( url )
@web.expose
def load_invalid_tool( self, trans, repository_id, tool_config, changeset_revision, **kwd ):
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'error' )
- repository, tool, error_message = common.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
+ repository, tool, error_message = suc.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
tool_state = self.__new_state( trans )
is_malicious = common.changeset_is_malicious( trans, repository_id, repository.tip( trans.app ) )
invalid_file_tups = []
@@ -1701,7 +1707,7 @@
alerts_checked = CheckboxField.is_checked( alerts )
category_ids = util.listify( params.get( 'category_id', '' ) )
if repository.email_alerts:
- email_alerts = from_json_string( repository.email_alerts )
+ email_alerts = json.from_json_string( repository.email_alerts )
else:
email_alerts = []
allow_push = params.get( 'allow_push', '' )
@@ -1776,12 +1782,12 @@
if alerts_checked:
if user.email not in email_alerts:
email_alerts.append( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
flush_needed = True
else:
if user.email in email_alerts:
email_alerts.remove( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
flush_needed = True
if flush_needed:
trans.sa_session.add( repository )
@@ -1830,7 +1836,7 @@
repository_dependencies = suc.get_repository_dependencies_for_changeset_revision( trans=trans,
repository=repository,
repository_metadata=repository_metadata,
- toolshed_base_url=str( url_for( '/', qualified=True ) ).rstrip( '/' ),
+ toolshed_base_url=str( web.url_for( '/', qualified=True ) ).rstrip( '/' ),
key_rd_dicts_to_be_processed=None,
all_repository_dependencies=None,
handled_key_rd_dicts=None )
@@ -1913,7 +1919,7 @@
Only inputs on the first page will be initialized unless `all_pages` is
True, in which case all inputs regardless of page are initialized.
"""
- state = DefaultToolState()
+ state = galaxy.tools.DefaultToolState()
state.inputs = {}
return state
@web.json
@@ -1939,7 +1945,7 @@
repository_dependencies = suc.get_repository_dependencies_for_changeset_revision( trans=trans,
repository=repository,
repository_metadata=repository_metadata,
- toolshed_base_url=str( url_for( '/', qualified=True ) ).rstrip( '/' ),
+ toolshed_base_url=str( web.url_for( '/', qualified=True ) ).rstrip( '/' ),
key_rd_dicts_to_be_processed=None,
all_repository_dependencies=None,
handled_key_rd_dicts=None )
@@ -2257,18 +2263,18 @@
for repository_id in repository_ids:
repository = suc.get_repository_in_tool_shed( trans, repository_id )
if repository.email_alerts:
- email_alerts = from_json_string( repository.email_alerts )
+ email_alerts = json.from_json_string( repository.email_alerts )
else:
email_alerts = []
if user.email in email_alerts:
email_alerts.remove( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
trans.sa_session.add( repository )
flush_needed = True
total_alerts_removed += 1
else:
email_alerts.append( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
trans.sa_session.add( repository )
flush_needed = True
total_alerts_added += 1
@@ -2424,7 +2430,7 @@
alerts = params.get( 'alerts', '' )
alerts_checked = CheckboxField.is_checked( alerts )
if repository.email_alerts:
- email_alerts = from_json_string( repository.email_alerts )
+ email_alerts = json.from_json_string( repository.email_alerts )
else:
email_alerts = []
repository_dependencies = None
@@ -2434,12 +2440,12 @@
if alerts_checked:
if user.email not in email_alerts:
email_alerts.append( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
flush_needed = True
else:
if user.email in email_alerts:
email_alerts.remove( user.email )
- repository.email_alerts = to_json_string( email_alerts )
+ repository.email_alerts = json.to_json_string( email_alerts )
flush_needed = True
if flush_needed:
trans.sa_session.add( repository )
@@ -2460,7 +2466,7 @@
repository_dependencies = suc.get_repository_dependencies_for_changeset_revision( trans=trans,
repository=repository,
repository_metadata=repository_metadata,
- toolshed_base_url=str( url_for( '/', qualified=True ) ).rstrip( '/' ),
+ toolshed_base_url=str( web.url_for( '/', qualified=True ) ).rstrip( '/' ),
key_rd_dicts_to_be_processed=None,
all_repository_dependencies=None,
handled_key_rd_dicts=None )
@@ -2527,7 +2533,7 @@
guid = tool_metadata_dict[ 'guid' ]
full_path_to_tool_config = os.path.abspath( relative_path_to_tool_config )
full_path_to_dir, tool_config_filename = os.path.split( full_path_to_tool_config )
- can_use_disk_file = common.can_use_tool_config_disk_file( trans, repository, repo, full_path_to_tool_config, changeset_revision )
+ can_use_disk_file = suc.can_use_tool_config_disk_file( trans, repository, repo, full_path_to_tool_config, changeset_revision )
if can_use_disk_file:
trans.app.config.tool_data_path = work_dir
tool, valid, message, sample_files = suc.handle_sample_files_and_load_tool_from_disk( trans,
@@ -2576,7 +2582,26 @@
review_id=review_id,
message=message,
status=status )
-
+ @web.expose
+ def view_workflow( self, trans, workflow_name, repository_metadata_id, **kwd ):
+ """Retrieve necessary information about a workflow from the database so that it can be displayed in an svg image."""
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if workflow_name:
+ workflow_name = encoding_util.tool_shed_decode( workflow_name )
+ repository_metadata = suc.get_repository_metadata_by_id( trans, repository_metadata_id )
+ repository = suc.get_repository_in_tool_shed( trans, trans.security.encode_id( repository_metadata.repository_id ) )
+ changeset_revision = repository_metadata.changeset_revision
+ metadata = repository_metadata.metadata
+ return trans.fill_template( "/webapps/community/repository/view_workflow.mako",
+ repository=repository,
+ changeset_revision=changeset_revision,
+ repository_metadata_id=repository_metadata_id,
+ workflow_name=workflow_name,
+ metadata=metadata,
+ message=message,
+ status=status )
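The new `view_workflow` method decodes the workflow name it receives on the URL with `encoding_util.tool_shed_decode`. A hypothetical stand-in pair showing the kind of round-trip involved (the real helpers in `galaxy.tool_shed.encoding_util` may use a different wire format):

```python
import binascii
import json

# Hypothetical stand-ins for encoding_util.tool_shed_encode/tool_shed_decode:
# JSON-serialize then hex-encode so arbitrary names survive a URL intact.
def tool_shed_encode(value):
    return binascii.hexlify(json.dumps(value).encode("utf-8")).decode("ascii")

def tool_shed_decode(encoded):
    return json.loads(binascii.unhexlify(encoded).decode("utf-8"))

name = "Workflow constructed from history 'Unnamed history'"
assert tool_shed_decode(tool_shed_encode(name)) == name
```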
# ----- Utility methods -----
def build_changeset_revision_select_field( trans, repository, selected_value=None, add_id_to_name=True,
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/controllers/workflow.py
--- a/lib/galaxy/webapps/community/controllers/workflow.py
+++ /dev/null
@@ -1,411 +0,0 @@
-import pkg_resources
-pkg_resources.require( "simplejson" )
-pkg_resources.require( "SVGFig" )
-import os, logging, ConfigParser, tempfile, shutil, svgfig
-from galaxy.webapps.community import model
-from galaxy.web.framework.helpers import time_ago, iff, grids
-from galaxy.util.json import from_json_string, to_json_string
-from galaxy.workflow.modules import InputDataModule, ToolModule, WorkflowModuleFactory
-from galaxy.web.base.controller import *
-from galaxy.tools import DefaultToolState
-from galaxy.webapps.galaxy.controllers.workflow import attach_ordered_steps
-import common
-import galaxy.util.shed_util_common as suc
-from galaxy.tool_shed import encoding_util
-
-class RepoInputDataModule( InputDataModule ):
-
- type = "data_input"
- name = "Input dataset"
-
- @classmethod
- def new( Class, trans, tools_metadata=None, tool_id=None ):
- module = Class( trans )
- module.state = dict( name="Input Dataset" )
- return module
- @classmethod
- def from_dict( Class, trans, repository_id, changeset_revision, step_dict, tools_metadata=None, secure=True ):
- module = Class( trans )
- state = from_json_string( step_dict[ "tool_state" ] )
- module.state = dict( name=state.get( "name", "Input Dataset" ) )
- return module
- @classmethod
- def from_workflow_step( Class, trans, repository_id, changeset_revision, tools_metadata, step ):
- module = Class( trans )
- module.state = dict( name="Input Dataset" )
- if step.tool_inputs and "name" in step.tool_inputs:
- module.state[ 'name' ] = step.tool_inputs[ 'name' ]
- return module
-
-class RepoToolModule( ToolModule ):
-
- type = "tool"
-
- def __init__( self, trans, repository_id, changeset_revision, tools_metadata, tool_id ):
- self.trans = trans
- self.tools_metadata = tools_metadata
- self.tool_id = tool_id
- self.tool = None
- self.errors = None
- for tool_dict in tools_metadata:
- if self.tool_id in [ tool_dict[ 'id' ], tool_dict[ 'guid' ] ]:
- repository, self.tool, message = common.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_dict[ 'tool_config' ] )
- if message and self.tool is None:
- self.errors = 'unavailable'
- break
- self.post_job_actions = {}
- self.workflow_outputs = []
- self.state = None
- @classmethod
- def new( Class, trans, repository_id, changeset_revision, tools_metadata, tool_id=None ):
- module = Class( trans, repository_id, changeset_revision, tools_metadata, tool_id )
- module.state = module.tool.new_state( trans, all_pages=True )
- return module
- @classmethod
- def from_dict( Class, trans, repository_id, changeset_revision, step_dict, tools_metadata, secure=True ):
- tool_id = step_dict[ 'tool_id' ]
- module = Class( trans, repository_id, changeset_revision, tools_metadata, tool_id )
- module.state = DefaultToolState()
- if module.tool is not None:
- module.state.decode( step_dict[ "tool_state" ], module.tool, module.trans.app, secure=secure )
- module.errors = step_dict.get( "tool_errors", None )
- return module
- @classmethod
- def from_workflow_step( Class, trans, repository_id, changeset_revision, tools_metadata, step ):
- module = Class( trans, repository_id, changeset_revision, tools_metadata, step.tool_id )
- module.state = DefaultToolState()
- if module.tool:
- module.state.inputs = module.tool.params_from_strings( step.tool_inputs, trans.app, ignore_errors=True )
- else:
- module.state.inputs = {}
- module.errors = step.tool_errors
- return module
- def get_data_inputs( self ):
- data_inputs = []
- def callback( input, value, prefixed_name, prefixed_label ):
- if isinstance( input, DataToolParameter ):
- data_inputs.append( dict( name=prefixed_name,
- label=prefixed_label,
- extensions=input.extensions ) )
- if self.tool:
- visit_input_values( self.tool.inputs, self.state.inputs, callback )
- return data_inputs
- def get_data_outputs( self ):
- data_outputs = []
- if self.tool:
- data_inputs = None
- for name, tool_output in self.tool.outputs.iteritems():
- if tool_output.format_source != None:
- # Default to special name "input" which remove restrictions on connections
- formats = [ 'input' ]
- if data_inputs == None:
- data_inputs = self.get_data_inputs()
- # Find the input parameter referenced by format_source
- for di in data_inputs:
- # Input names come prefixed with conditional and repeat names separated by '|',
- # so remove prefixes when comparing with format_source.
- if di[ 'name' ] != None and di[ 'name' ].split( '|' )[ -1 ] == tool_output.format_source:
- formats = di[ 'extensions' ]
- else:
- formats = [ tool_output.format ]
- for change_elem in tool_output.change_format:
- for when_elem in change_elem.findall( 'when' ):
- format = when_elem.get( 'format', None )
- if format and format not in formats:
- formats.append( format )
- data_outputs.append( dict( name=name, extensions=formats ) )
- return data_outputs
-
-class RepoWorkflowModuleFactory( WorkflowModuleFactory ):
- def __init__( self, module_types ):
- self.module_types = module_types
- def new( self, trans, type, tools_metadata=None, tool_id=None ):
- """Return module for type and (optional) tool_id initialized with new / default state."""
- assert type in self.module_types
- return self.module_types[type].new( trans, tool_id )
- def from_dict( self, trans, repository_id, changeset_revision, step_dict, **kwd ):
- """Return module initialized from the data in dictionary `step_dict`."""
- type = step_dict[ 'type' ]
- assert type in self.module_types
- return self.module_types[ type ].from_dict( trans, repository_id, changeset_revision, step_dict, **kwd )
- def from_workflow_step( self, trans, repository_id, changeset_revision, tools_metadata, step ):
- """Return module initialized from the WorkflowStep object `step`."""
- type = step.type
- return self.module_types[ type ].from_workflow_step( trans, repository_id, changeset_revision, tools_metadata, step )
-
-module_factory = RepoWorkflowModuleFactory( dict( data_input=RepoInputDataModule, tool=RepoToolModule ) )
-
-class WorkflowController( BaseUIController ):
- @web.expose
- def view_workflow( self, trans, **kwd ):
- repository_metadata_id = kwd.get( 'repository_metadata_id', '' )
- workflow_name = kwd.get( 'workflow_name', '' )
- if workflow_name:
- workflow_name = encoding_util.tool_shed_decode( workflow_name )
- message = kwd.get( 'message', '' )
- status = kwd.get( 'status', 'done' )
- repository_metadata = common.get_repository_metadata_by_id( trans, repository_metadata_id )
- repository = suc.get_repository_in_tool_shed( trans, trans.security.encode_id( repository_metadata.repository_id ) )
- return trans.fill_template( "/webapps/community/repository/view_workflow.mako",
- repository=repository,
- changeset_revision=repository_metadata.changeset_revision,
- repository_metadata_id=repository_metadata_id,
- workflow_name=workflow_name,
- metadata=repository_metadata.metadata,
- message=message,
- status=status )
- @web.expose
- def generate_workflow_image( self, trans, repository_metadata_id, workflow_name ):
- repository_metadata = common.get_repository_metadata_by_id( trans, repository_metadata_id )
- repository_id = trans.security.encode_id( repository_metadata.repository_id )
- changeset_revision = repository_metadata.changeset_revision
- metadata = repository_metadata.metadata
- workflow_name = encoding_util.tool_shed_decode( workflow_name )
- # metadata[ 'workflows' ] is a list of tuples where each contained tuple is
- # [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
- for workflow_tup in metadata[ 'workflows' ]:
- workflow_dict = workflow_tup[1]
- if workflow_dict[ 'name' ] == workflow_name:
- break
- if 'tools' in metadata:
- tools_metadata = metadata[ 'tools' ]
- else:
- tools_metadata = []
- workflow, missing_tool_tups = self.__workflow_from_dict( trans, workflow_dict, tools_metadata, repository_id, changeset_revision )
- data = []
- canvas = svgfig.canvas( style="stroke:black; fill:none; stroke-width:1px; stroke-linejoin:round; text-anchor:left" )
- text = svgfig.SVG( "g" )
- connectors = svgfig.SVG( "g" )
- boxes = svgfig.SVG( "g" )
- svgfig.Text.defaults[ "font-size" ] = "10px"
- in_pos = {}
- out_pos = {}
- margin = 5
- # Spacing between input/outputs.
- line_px = 16
- # Store px width for boxes of each step.
- widths = {}
- max_width, max_x, max_y = 0, 0, 0
- for step in workflow.steps:
- step.upgrade_messages = {}
- module = module_factory.from_workflow_step( trans, repository_id, changeset_revision, tools_metadata, step )
- tool_errors = module.type == 'tool' and not module.tool
- module_data_inputs = self.__get_data_inputs( step, module )
- module_data_outputs = self.__get_data_outputs( step, module, workflow.steps )
- step_dict = {
- 'id' : step.order_index,
- 'data_inputs' : module_data_inputs,
- 'data_outputs' : module_data_outputs,
- 'position' : step.position,
- 'tool_errors' : tool_errors
- }
- input_conn_dict = {}
- for conn in step.input_connections:
- input_conn_dict[ conn.input_name ] = dict( id=conn.output_step.order_index, output_name=conn.output_name )
- step_dict[ 'input_connections' ] = input_conn_dict
- data.append( step_dict )
- x, y = step.position[ 'left' ], step.position[ 'top' ]
- count = 0
- module_name = self.__get_name( module, missing_tool_tups )
- max_len = len( module_name ) * 1.5
- text.append( svgfig.Text( x, y + 20, module_name, **{ "font-size": "14px" } ).SVG() )
- y += 45
- for di in module_data_inputs:
- cur_y = y + count * line_px
- if step.order_index not in in_pos:
- in_pos[ step.order_index ] = {}
- in_pos[ step.order_index ][ di[ 'name' ] ] = ( x, cur_y )
- text.append( svgfig.Text( x, cur_y, di[ 'label' ] ).SVG() )
- count += 1
- max_len = max( max_len, len( di[ 'label' ] ) )
- if len( module.get_data_inputs() ) > 0:
- y += 15
- for do in module_data_outputs:
- cur_y = y + count * line_px
- if step.order_index not in out_pos:
- out_pos[ step.order_index ] = {}
- out_pos[ step.order_index ][ do[ 'name' ] ] = ( x, cur_y )
- text.append( svgfig.Text( x, cur_y, do[ 'name' ] ).SVG() )
- count += 1
- max_len = max( max_len, len( do['name' ] ) )
- widths[ step.order_index ] = max_len * 5.5
- max_x = max( max_x, step.position[ 'left' ] )
- max_y = max( max_y, step.position[ 'top' ] )
- max_width = max( max_width, widths[ step.order_index ] )
- for step_dict in data:
- tool_unavailable = step_dict[ 'tool_errors' ]
- width = widths[ step_dict[ 'id' ] ]
- x, y = step_dict[ 'position' ][ 'left' ], step_dict[ 'position' ][ 'top' ]
- if tool_unavailable:
- fill = "#EBBCB2"
- else:
- fill = "#EBD9B2"
- boxes.append( svgfig.Rect( x - margin, y, x + width - margin, y + 30, fill=fill ).SVG() )
- box_height = ( len( step_dict[ 'data_inputs' ] ) + len( step_dict[ 'data_outputs' ] ) ) * line_px + margin
- # Draw separator line.
- if len( step_dict[ 'data_inputs' ] ) > 0:
- box_height += 15
- sep_y = y + len( step_dict[ 'data_inputs' ] ) * line_px + 40
- text.append( svgfig.Line( x - margin, sep_y, x + width - margin, sep_y ).SVG() )
- # Define an input/output box.
- boxes.append( svgfig.Rect( x - margin, y + 30, x + width - margin, y + 30 + box_height, fill="#ffffff" ).SVG() )
- for conn, output_dict in step_dict[ 'input_connections' ].iteritems():
- in_coords = in_pos[ step_dict[ 'id' ] ][ conn ]
- # out_pos_index will be a step number like 1, 2, 3...
- out_pos_index = output_dict[ 'id' ]
- # out_pos_name will be a string like 'o', 'o2', etc.
- out_pos_name = output_dict[ 'output_name' ]
- if out_pos_index in out_pos:
- # out_conn_index_dict will be something like:
- # 7: {'o': (824.5, 618)}
- out_conn_index_dict = out_pos[ out_pos_index ]
- if out_pos_name in out_conn_index_dict:
- out_conn_pos = out_pos[ out_pos_index ][ out_pos_name ]
- else:
- # Take any key / value pair available in out_conn_index_dict.
- # A problem will result if the dictionary is empty.
- if out_conn_index_dict.keys():
- key = out_conn_index_dict.keys()[0]
- out_conn_pos = out_pos[ out_pos_index ][ key ]
- adjusted = ( out_conn_pos[ 0 ] + widths[ output_dict[ 'id' ] ], out_conn_pos[ 1 ] )
- text.append( svgfig.SVG( "circle",
- cx=out_conn_pos[ 0 ] + widths[ output_dict[ 'id' ] ] - margin,
- cy=out_conn_pos[ 1 ] - margin,
- r = 5,
- fill="#ffffff" ) )
- connectors.append( svgfig.Line( adjusted[ 0 ],
- adjusted[ 1 ] - margin,
- in_coords[ 0 ] - 10,
- in_coords[ 1 ],
- arrow_end = "true" ).SVG() )
- canvas.append( connectors )
- canvas.append( boxes )
- canvas.append( text )
- width, height = ( max_x + max_width + 50 ), max_y + 300
- canvas[ 'width' ] = "%s px" % width
- canvas[ 'height' ] = "%s px" % height
- canvas[ 'viewBox' ] = "0 0 %s %s" % ( width, height )
- trans.response.set_content_type( "image/svg+xml" )
- return canvas.standalone_xml()
- def __get_name( self, module, missing_tool_tups ):
- module_name = module.get_name()
- if module.type == 'tool' and module_name == 'unavailable':
- for missing_tool_tup in missing_tool_tups:
- missing_tool_id, missing_tool_name, missing_tool_version = missing_tool_tup
- if missing_tool_id == module.tool_id:
- module_name = '%s' % missing_tool_name
- return module_name
- def __get_data_inputs( self, step, module ):
- if module.type == 'tool':
- if module.tool:
- return module.get_data_inputs()
- else:
- data_inputs = []
- for wfsc in step.input_connections:
- data_inputs_dict = {}
- data_inputs_dict[ 'extensions' ] = [ '' ]
- data_inputs_dict[ 'name' ] = wfsc.input_name
- data_inputs_dict[ 'label' ] = 'Unknown'
- data_inputs.append( data_inputs_dict )
- return data_inputs
- return module.get_data_inputs()
- def __get_data_outputs( self, step, module, steps ):
- if module.type == 'tool':
- if module.tool:
- return module.get_data_outputs()
- else:
- data_outputs = []
- data_outputs_dict = {}
- data_outputs_dict[ 'extensions' ] = [ 'input' ]
- found = False
- for workflow_step in steps:
- for wfsc in workflow_step.input_connections:
- if step.name == wfsc.output_step.name:
- data_outputs_dict[ 'name' ] = wfsc.output_name
- found = True
- break
- if found:
- break
- if not found:
- # We're at the last step of the workflow.
- data_outputs_dict[ 'name' ] = 'output'
- data_outputs.append( data_outputs_dict )
- return data_outputs
- return module.get_data_outputs()
- def __workflow_from_dict( self, trans, workflow_dict, tools_metadata, repository_id, changeset_revision ):
- """Creates and returns workflow object from a dictionary."""
- trans.workflow_building_mode = True
- workflow = model.Workflow()
- workflow.name = workflow_dict[ 'name' ]
- workflow.has_errors = False
- steps = []
- # Keep ids for each step that we need to use to make connections.
- steps_by_external_id = {}
- # Keep track of tools required by the workflow that are not available in
- # the tool shed repository. Each tuple in the list of missing_tool_tups
- # will be ( tool_id, tool_name, tool_version ).
- missing_tool_tups = []
- # First pass to build step objects and populate basic values
- for key, step_dict in workflow_dict[ 'steps' ].iteritems():
- # Create the model class for the step
- step = model.WorkflowStep()
- step.name = step_dict[ 'name' ]
- step.position = step_dict[ 'position' ]
- module = module_factory.from_dict( trans, repository_id, changeset_revision, step_dict, tools_metadata=tools_metadata, secure=False )
- if module.type == 'tool' and module.tool is None:
- # A required tool is not available in the current repository.
- step.tool_errors = 'unavailable'
- missing_tool_tup = ( step_dict[ 'tool_id' ], step_dict[ 'name' ], step_dict[ 'tool_version' ] )
- if missing_tool_tup not in missing_tool_tups:
- missing_tool_tups.append( missing_tool_tup )
- module.save_to_step( step )
- if step.tool_errors:
- workflow.has_errors = True
- # Stick this in the step temporarily.
- step.temp_input_connections = step_dict[ 'input_connections' ]
- steps.append( step )
- steps_by_external_id[ step_dict[ 'id' ] ] = step
- # Second pass to deal with connections between steps.
- for step in steps:
- # Input connections.
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
- output_step = steps_by_external_id[ conn_dict[ 'id' ] ]
- conn = model.WorkflowStepConnection()
- conn.input_step = step
- conn.input_name = input_name
- conn.output_step = output_step
- conn.output_name = conn_dict[ 'output_name' ]
- step.input_connections.append( conn )
- del step.temp_input_connections
- # Order the steps if possible.
- attach_ordered_steps( workflow, steps )
- return workflow, missing_tool_tups
- @web.expose
- def import_workflow( self, trans, **kwd ):
- repository_metadata_id = kwd.get( 'repository_metadata_id', '' )
- workflow_name = kwd.get( 'workflow_name', '' )
- if workflow_name:
- workflow_name = encoding_util.tool_shed_decode( workflow_name )
- message = kwd.get( 'message', '' )
- status = kwd.get( 'status', 'done' )
- repository_metadata = get_repository_metadata_by_id( trans, repository_metadata_id )
- workflows = repository_metadata.metadata[ 'workflows' ]
- workflow_data = None
- for workflow_data in workflows:
- if workflow_data[ 'name' ] == workflow_name:
- break
- if workflow_data:
- if kwd.get( 'open_for_url', False ):
- tmp_fd, tmp_fname = tempfile.mkstemp()
- to_file = open( tmp_fname, 'wb' )
- to_file.write( to_json_string( workflow_data ) )
- return open( tmp_fname )
- galaxy_url = trans.get_cookie( name='toolshedgalaxyurl' )
- url = '%sworkflow/import_workflow?tool_shed_url=%s&repository_metadata_id=%s&workflow_name=%s' % \
- ( galaxy_url, url_for( '/', qualified=True ), repository_metadata_id, encoding_util.tool_shed_encode( workflow_name ) )
- return trans.response.send_redirect( url )
- return trans.response.send_redirect( web.url_for( controller='workflow',
- action='view_workflow',
- message=message,
- status=status ) )
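The deleted controller's `generate_workflow_image` (relocated by this commit rather than dropped) assembles the drawing in three grouped layers, connectors, then boxes, then text, appended to the canvas in that order so labels are painted last and never overdrawn. A stdlib sketch of that layering idea (SVGFig remains the actual renderer; this stand-in only illustrates the draw order):

```python
import xml.etree.ElementTree as ET

svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg", width="200", height="100")
connectors = ET.SubElement(svg, "g")  # arrows, painted first
boxes = ET.SubElement(svg, "g")       # step boxes, painted over the arrows
text = ET.SubElement(svg, "g")        # labels, painted last
ET.SubElement(connectors, "line", x1="0", y1="15", x2="40", y2="15")
ET.SubElement(boxes, "rect", x="40", y="0", width="120", height="30", fill="#EBD9B2")
label = ET.SubElement(text, "text", x="45", y="20")
label.text = "Input Dataset"
markup = ET.tostring(svg, encoding="unicode")
```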
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/util/container_util.py
--- a/lib/galaxy/webapps/community/util/container_util.py
+++ b/lib/galaxy/webapps/community/util/container_util.py
@@ -56,11 +56,12 @@
class InvalidTool( object ):
"""Invalid tool object"""
- def __init__( self, id=None, tool_config=None, repository_id=None, changeset_revision=None ):
+ def __init__( self, id=None, tool_config=None, repository_id=None, changeset_revision=None, repository_installation_status=None ):
self.id = id
self.tool_config = tool_config
self.repository_id = repository_id
self.changeset_revision = changeset_revision
+ self.repository_installation_status = repository_installation_status
class ReadMe( object ):
"""Readme text object"""
@@ -86,7 +87,7 @@
class Tool( object ):
"""Tool object"""
def __init__( self, id=None, tool_config=None, tool_id=None, name=None, description=None, version=None, requirements=None,
- repository_id=None, changeset_revision=None ):
+ repository_id=None, changeset_revision=None, repository_installation_status=None ):
self.id = id
self.tool_config = tool_config
self.tool_id = tool_id
@@ -96,6 +97,7 @@
self.requirements = requirements
self.repository_id = repository_id
self.changeset_revision = changeset_revision
+ self.repository_installation_status = repository_installation_status
class ToolDependency( object ):
"""Tool dependency object"""
@@ -112,14 +114,17 @@
self.tool_dependency_id = tool_dependency_id
class Workflow( object ):
- """Workflow object"""
- def __init__( self, id=None, repository_metadata_id=None, workflow_name=None, steps=None, format_version=None, annotation=None ):
+ """Workflow object."""
+ def __init__( self, id=None, workflow_name=None, steps=None, format_version=None, annotation=None, repository_metadata_id=None, repository_id=None ):
+ # When rendered in the tool shed, repository_metadata_id will have a value and repository_id will be None. When rendered in Galaxy, repository_id
+ # will have a value and repository_metadata_id will be None.
self.id = id
- self.repository_metadata_id = repository_metadata_id
self.workflow_name = workflow_name
self.steps = steps
self.format_version = format_version
self.annotation = annotation
+ self.repository_metadata_id = repository_metadata_id
+ self.repository_id = repository_id
def build_datatypes_folder( trans, folder_id, datatypes, label='Datatypes' ):
"""Return a folder hierarchy containing datatypes."""
@@ -163,12 +168,18 @@
invalid_tool_id += 1
if repository:
repository_id = repository.id
+ if trans.webapp.name == 'galaxy':
+ repository_installation_status = repository.status
+ else:
+ repository_installation_status = None
else:
repository_id = None
+ repository_installation_status = None
invalid_tool = InvalidTool( id=invalid_tool_id,
tool_config=invalid_tool_config,
repository_id=repository_id,
- changeset_revision=changeset_revision )
+ changeset_revision=changeset_revision,
+ repository_installation_status=repository_installation_status )
folder.invalid_tools.append( invalid_tool )
else:
invalid_tools_root_folder = None
@@ -249,8 +260,13 @@
folder.valid_tools.append( tool )
if repository:
repository_id = repository.id
+ if trans.webapp.name == 'galaxy':
+ repository_installation_status = repository.status
+ else:
+ repository_installation_status = None
else:
- repository_id = ''
+ repository_id = None
+ repository_installation_status = None
for tool_dict in tool_dicts:
tool_id += 1
if 'requirements' in tool_dict:
@@ -269,7 +285,8 @@
version=tool_dict[ 'version' ],
requirements=requirements_str,
repository_id=repository_id,
- changeset_revision=changeset_revision )
+ changeset_revision=changeset_revision,
+ repository_installation_status=repository_installation_status )
folder.valid_tools.append( tool )
else:
tools_root_folder = None
@@ -351,8 +368,13 @@
else:
tool_dependencies_root_folder = None
return folder_id, tool_dependencies_root_folder
-def build_workflows_folder( trans, folder_id, workflows, repository_metadata, label='Workflows' ):
- """Return a folder hierarchy containing invalid tools."""
+def build_workflows_folder( trans, folder_id, workflows, repository_metadata_id=None, repository_id=None, label='Workflows' ):
+ """
+ Return a folder hierarchy containing workflow objects for each workflow dictionary in the received workflows list. When
+ this method is called from the tool shed, repository_metadata_id will have a value and repository_id will be None. When
+ this method is called from Galaxy, repository_id will have a value only if the repository is not currently being installed
+ and repository_metadata_id will be None.
+ """
if workflows:
workflow_id = 0
folder_id += 1
@@ -363,11 +385,12 @@
# Insert a header row.
workflow_id += 1
workflow = Workflow( id=workflow_id,
- repository_metadata_id=None,
workflow_name='Name',
steps='steps',
format_version='format-version',
- annotation='annotation' )
+ annotation='annotation',
+ repository_metadata_id=repository_metadata_id,
+ repository_id=repository_id )
folder.workflows.append( workflow )
for workflow_tup in workflows:
workflow_dict=workflow_tup[ 1 ]
@@ -378,11 +401,12 @@
steps = 'unknown'
workflow_id += 1
workflow = Workflow( id=workflow_id,
- repository_metadata_id=repository_metadata.id,
workflow_name=workflow_dict[ 'name' ],
steps=steps,
format_version=workflow_dict[ 'format-version' ],
- annotation=workflow_dict[ 'annotation' ] )
+ annotation=workflow_dict[ 'annotation' ],
+ repository_metadata_id=repository_metadata_id,
+ repository_id=repository_id )
folder.workflows.append( workflow )
else:
workflows_root_folder = None
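`build_workflows_folder` prepends a pseudo-row whose fields hold the column labels ('Name', 'steps', 'format-version', 'annotation') ahead of the real workflow rows. A self-contained sketch of that header-row convention (the stub class and helper are hypothetical):

```python
class Row(object):
    def __init__(self, id, workflow_name, steps, format_version, annotation):
        self.id = id
        self.workflow_name = workflow_name
        self.steps = steps
        self.format_version = format_version
        self.annotation = annotation

def build_workflow_rows(workflow_dicts):
    # Row 1 is a header pseudo-row holding the column labels.
    rows = [Row(1, 'Name', 'steps', 'format-version', 'annotation')]
    for row_id, wf in enumerate(workflow_dicts, start=2):
        steps = len(wf['steps']) if 'steps' in wf else 'unknown'
        rows.append(Row(row_id, wf['name'], steps,
                        wf['format-version'], wf['annotation']))
    return rows
```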
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/community/util/workflow_util.py
--- /dev/null
+++ b/lib/galaxy/webapps/community/util/workflow_util.py
@@ -0,0 +1,384 @@
+from galaxy import eggs
+import pkg_resources
+
+pkg_resources.require( "SVGFig" )
+
+import logging, svgfig
+from galaxy.util import json
+import galaxy.util.shed_util_common as suc
+from galaxy.tool_shed import encoding_util
+from galaxy.workflow.modules import InputDataModule, ToolModule, WorkflowModuleFactory
+import galaxy.webapps.galaxy.controllers.workflow
+import galaxy.tools
+import galaxy.tools.parameters
+
+log = logging.getLogger( __name__ )
+
+class RepoInputDataModule( InputDataModule ):
+
+ type = "data_input"
+ name = "Input dataset"
+
+ @classmethod
+ def new( Class, trans, tools_metadata=None, tool_id=None ):
+ module = Class( trans )
+ module.state = dict( name="Input Dataset" )
+ return module
+ @classmethod
+ def from_dict( Class, trans, repository_id, changeset_revision, step_dict, tools_metadata=None, secure=True ):
+ module = Class( trans )
+ state = json.from_json_string( step_dict[ "tool_state" ] )
+ module.state = dict( name=state.get( "name", "Input Dataset" ) )
+ return module
+ @classmethod
+ def from_workflow_step( Class, trans, repository_id, changeset_revision, tools_metadata, step ):
+ module = Class( trans )
+ module.state = dict( name="Input Dataset" )
+ if step.tool_inputs and "name" in step.tool_inputs:
+ module.state[ 'name' ] = step.tool_inputs[ 'name' ]
+ return module
+
+class RepoToolModule( ToolModule ):
+
+ type = "tool"
+
+ def __init__( self, trans, repository_id, changeset_revision, tools_metadata, tool_id ):
+ self.trans = trans
+ self.tools_metadata = tools_metadata
+ self.tool_id = tool_id
+ self.tool = None
+ self.errors = None
+ for tool_dict in tools_metadata:
+ if self.tool_id in [ tool_dict[ 'id' ], tool_dict[ 'guid' ] ]:
+ if trans.webapp.name == 'community':
+ # We're in the tool shed.
+ repository, self.tool, message = suc.load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_dict[ 'tool_config' ] )
+ if message and self.tool is None:
+ self.errors = 'unavailable'
+ break
+ else:
+ # We're in Galaxy.
+ self.tool = trans.app.toolbox.tools_by_id.get( self.tool_id, None )
+ if self.tool is None:
+ self.errors = 'unavailable'
+ self.post_job_actions = {}
+ self.workflow_outputs = []
+ self.state = None
+ @classmethod
+ def new( Class, trans, repository_id, changeset_revision, tools_metadata, tool_id=None ):
+ module = Class( trans, repository_id, changeset_revision, tools_metadata, tool_id )
+ module.state = module.tool.new_state( trans, all_pages=True )
+ return module
+ @classmethod
+ def from_dict( Class, trans, repository_id, changeset_revision, step_dict, tools_metadata, secure=True ):
+ tool_id = step_dict[ 'tool_id' ]
+ module = Class( trans, repository_id, changeset_revision, tools_metadata, tool_id )
+ module.state = galaxy.tools.DefaultToolState()
+ if module.tool is not None:
+ module.state.decode( step_dict[ "tool_state" ], module.tool, module.trans.app, secure=secure )
+ module.errors = step_dict.get( "tool_errors", None )
+ return module
+ @classmethod
+ def from_workflow_step( Class, trans, repository_id, changeset_revision, tools_metadata, step ):
+ module = Class( trans, repository_id, changeset_revision, tools_metadata, step.tool_id )
+ module.state = galaxy.tools.DefaultToolState()
+ if module.tool:
+ module.state.inputs = module.tool.params_from_strings( step.tool_inputs, trans.app, ignore_errors=True )
+ else:
+ module.state.inputs = {}
+ module.errors = step.tool_errors
+ return module
+ def get_data_inputs( self ):
+ data_inputs = []
+ def callback( input, value, prefixed_name, prefixed_label ):
+ if isinstance( input, galaxy.tools.parameters.DataToolParameter ):
+ data_inputs.append( dict( name=prefixed_name,
+ label=prefixed_label,
+ extensions=input.extensions ) )
+ if self.tool:
+ galaxy.tools.parameters.visit_input_values( self.tool.inputs, self.state.inputs, callback )
+ return data_inputs
+ def get_data_outputs( self ):
+ data_outputs = []
+ if self.tool:
+ data_inputs = None
+ for name, tool_output in self.tool.outputs.iteritems():
+                if tool_output.format_source is not None:
+                    # Default to the special name "input", which removes restrictions on connections.
+                    formats = [ 'input' ]
+                    if data_inputs is None:
+                        data_inputs = self.get_data_inputs()
+ # Find the input parameter referenced by format_source
+ for di in data_inputs:
+ # Input names come prefixed with conditional and repeat names separated by '|',
+ # so remove prefixes when comparing with format_source.
+                        if di[ 'name' ] is not None and di[ 'name' ].split( '|' )[ -1 ] == tool_output.format_source:
+ formats = di[ 'extensions' ]
+ else:
+ formats = [ tool_output.format ]
+ for change_elem in tool_output.change_format:
+ for when_elem in change_elem.findall( 'when' ):
+ format = when_elem.get( 'format', None )
+ if format and format not in formats:
+ formats.append( format )
+ data_outputs.append( dict( name=name, extensions=formats ) )
+ return data_outputs
+
+class RepoWorkflowModuleFactory( WorkflowModuleFactory ):
+ def __init__( self, module_types ):
+ self.module_types = module_types
+ def new( self, trans, type, tools_metadata=None, tool_id=None ):
+ """Return module for type and (optional) tool_id initialized with new / default state."""
+ assert type in self.module_types
+        return self.module_types[ type ].new( trans, tools_metadata=tools_metadata, tool_id=tool_id )
+ def from_dict( self, trans, repository_id, changeset_revision, step_dict, **kwd ):
+ """Return module initialized from the data in dictionary `step_dict`."""
+ type = step_dict[ 'type' ]
+ assert type in self.module_types
+ return self.module_types[ type ].from_dict( trans, repository_id, changeset_revision, step_dict, **kwd )
+ def from_workflow_step( self, trans, repository_id, changeset_revision, tools_metadata, step ):
+ """Return module initialized from the WorkflowStep object `step`."""
+ type = step.type
+ return self.module_types[ type ].from_workflow_step( trans, repository_id, changeset_revision, tools_metadata, step )
+
+module_factory = RepoWorkflowModuleFactory( dict( data_input=RepoInputDataModule, tool=RepoToolModule ) )
+
+def generate_workflow_image( trans, workflow_name, repository_metadata_id=None, repository_id=None ):
+ """
+ Return an svg image representation of a workflow dictionary created when the workflow was exported. This method is called
+ from both Galaxy and the tool shed. When called from the tool shed, repository_metadata_id will have a value and repository_id
+ will be None. When called from Galaxy, repository_metadata_id will be None and repository_id will have a value.
+ """
+ workflow_name = encoding_util.tool_shed_decode( workflow_name )
+ if trans.webapp.name == 'community':
+ # We're in the tool shed.
+ repository_metadata = suc.get_repository_metadata_by_id( trans, repository_metadata_id )
+ repository_id = trans.security.encode_id( repository_metadata.repository_id )
+ changeset_revision = repository_metadata.changeset_revision
+ metadata = repository_metadata.metadata
+ else:
+ # We're in Galaxy.
+ repository = suc.get_tool_shed_repository_by_id( trans, repository_id )
+ changeset_revision = repository.changeset_revision
+ metadata = repository.metadata
+ # metadata[ 'workflows' ] is a list of tuples where each contained tuple is
+ # [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
+ for workflow_tup in metadata[ 'workflows' ]:
+ workflow_dict = workflow_tup[1]
+ if workflow_dict[ 'name' ] == workflow_name:
+ break
+ if 'tools' in metadata:
+ tools_metadata = metadata[ 'tools' ]
+ else:
+ tools_metadata = []
+ workflow, missing_tool_tups = get_workflow_from_dict( trans, workflow_dict, tools_metadata, repository_id, changeset_revision )
+ data = []
+ canvas = svgfig.canvas( style="stroke:black; fill:none; stroke-width:1px; stroke-linejoin:round; text-anchor:left" )
+ text = svgfig.SVG( "g" )
+ connectors = svgfig.SVG( "g" )
+ boxes = svgfig.SVG( "g" )
+ svgfig.Text.defaults[ "font-size" ] = "10px"
+ in_pos = {}
+ out_pos = {}
+ margin = 5
+ # Spacing between input/outputs.
+ line_px = 16
+ # Store px width for boxes of each step.
+ widths = {}
+ max_width, max_x, max_y = 0, 0, 0
+ for step in workflow.steps:
+ step.upgrade_messages = {}
+ module = module_factory.from_workflow_step( trans, repository_id, changeset_revision, tools_metadata, step )
+ tool_errors = module.type == 'tool' and not module.tool
+ module_data_inputs = get_workflow_data_inputs( step, module )
+ module_data_outputs = get_workflow_data_outputs( step, module, workflow.steps )
+ step_dict = {
+ 'id' : step.order_index,
+ 'data_inputs' : module_data_inputs,
+ 'data_outputs' : module_data_outputs,
+ 'position' : step.position,
+ 'tool_errors' : tool_errors
+ }
+ input_conn_dict = {}
+ for conn in step.input_connections:
+ input_conn_dict[ conn.input_name ] = dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ step_dict[ 'input_connections' ] = input_conn_dict
+ data.append( step_dict )
+ x, y = step.position[ 'left' ], step.position[ 'top' ]
+ count = 0
+ module_name = get_workflow_module_name( module, missing_tool_tups )
+ max_len = len( module_name ) * 1.5
+ text.append( svgfig.Text( x, y + 20, module_name, **{ "font-size": "14px" } ).SVG() )
+ y += 45
+ for di in module_data_inputs:
+ cur_y = y + count * line_px
+ if step.order_index not in in_pos:
+ in_pos[ step.order_index ] = {}
+ in_pos[ step.order_index ][ di[ 'name' ] ] = ( x, cur_y )
+ text.append( svgfig.Text( x, cur_y, di[ 'label' ] ).SVG() )
+ count += 1
+ max_len = max( max_len, len( di[ 'label' ] ) )
+        if module_data_inputs:
+ y += 15
+ for do in module_data_outputs:
+ cur_y = y + count * line_px
+ if step.order_index not in out_pos:
+ out_pos[ step.order_index ] = {}
+ out_pos[ step.order_index ][ do[ 'name' ] ] = ( x, cur_y )
+ text.append( svgfig.Text( x, cur_y, do[ 'name' ] ).SVG() )
+ count += 1
+            max_len = max( max_len, len( do[ 'name' ] ) )
+ widths[ step.order_index ] = max_len * 5.5
+ max_x = max( max_x, step.position[ 'left' ] )
+ max_y = max( max_y, step.position[ 'top' ] )
+ max_width = max( max_width, widths[ step.order_index ] )
+ for step_dict in data:
+ tool_unavailable = step_dict[ 'tool_errors' ]
+ width = widths[ step_dict[ 'id' ] ]
+ x, y = step_dict[ 'position' ][ 'left' ], step_dict[ 'position' ][ 'top' ]
+ if tool_unavailable:
+ fill = "#EBBCB2"
+ else:
+ fill = "#EBD9B2"
+ boxes.append( svgfig.Rect( x - margin, y, x + width - margin, y + 30, fill=fill ).SVG() )
+ box_height = ( len( step_dict[ 'data_inputs' ] ) + len( step_dict[ 'data_outputs' ] ) ) * line_px + margin
+ # Draw separator line.
+ if len( step_dict[ 'data_inputs' ] ) > 0:
+ box_height += 15
+ sep_y = y + len( step_dict[ 'data_inputs' ] ) * line_px + 40
+ text.append( svgfig.Line( x - margin, sep_y, x + width - margin, sep_y ).SVG() )
+ # Define an input/output box.
+ boxes.append( svgfig.Rect( x - margin, y + 30, x + width - margin, y + 30 + box_height, fill="#ffffff" ).SVG() )
+ for conn, output_dict in step_dict[ 'input_connections' ].iteritems():
+ in_coords = in_pos[ step_dict[ 'id' ] ][ conn ]
+ # out_pos_index will be a step number like 1, 2, 3...
+ out_pos_index = output_dict[ 'id' ]
+ # out_pos_name will be a string like 'o', 'o2', etc.
+ out_pos_name = output_dict[ 'output_name' ]
+ if out_pos_index in out_pos:
+ # out_conn_index_dict will be something like:
+ # 7: {'o': (824.5, 618)}
+ out_conn_index_dict = out_pos[ out_pos_index ]
+ if out_pos_name in out_conn_index_dict:
+ out_conn_pos = out_pos[ out_pos_index ][ out_pos_name ]
+                else:
+                    # Fall back to any key / value pair available in out_conn_index_dict.
+                    # Skip this connection if the dictionary is empty, since no output
+                    # position is available for drawing the connector.
+                    if not out_conn_index_dict:
+                        continue
+                    key = out_conn_index_dict.keys()[0]
+                    out_conn_pos = out_pos[ out_pos_index ][ key ]
+ adjusted = ( out_conn_pos[ 0 ] + widths[ output_dict[ 'id' ] ], out_conn_pos[ 1 ] )
+ text.append( svgfig.SVG( "circle",
+ cx=out_conn_pos[ 0 ] + widths[ output_dict[ 'id' ] ] - margin,
+ cy=out_conn_pos[ 1 ] - margin,
+ r = 5,
+ fill="#ffffff" ) )
+ connectors.append( svgfig.Line( adjusted[ 0 ],
+ adjusted[ 1 ] - margin,
+ in_coords[ 0 ] - 10,
+ in_coords[ 1 ],
+ arrow_end = "true" ).SVG() )
+ canvas.append( connectors )
+ canvas.append( boxes )
+ canvas.append( text )
+ width, height = ( max_x + max_width + 50 ), max_y + 300
+ canvas[ 'width' ] = "%s px" % width
+ canvas[ 'height' ] = "%s px" % height
+ canvas[ 'viewBox' ] = "0 0 %s %s" % ( width, height )
+ trans.response.set_content_type( "image/svg+xml" )
+ return canvas.standalone_xml()
+def get_workflow_data_inputs( step, module ):
+ if module.type == 'tool':
+ if module.tool:
+ return module.get_data_inputs()
+ else:
+ data_inputs = []
+ for wfsc in step.input_connections:
+ data_inputs_dict = {}
+ data_inputs_dict[ 'extensions' ] = [ '' ]
+ data_inputs_dict[ 'name' ] = wfsc.input_name
+ data_inputs_dict[ 'label' ] = 'Unknown'
+ data_inputs.append( data_inputs_dict )
+ return data_inputs
+ return module.get_data_inputs()
+def get_workflow_data_outputs( step, module, steps ):
+ if module.type == 'tool':
+ if module.tool:
+ return module.get_data_outputs()
+ else:
+ data_outputs = []
+ data_outputs_dict = {}
+ data_outputs_dict[ 'extensions' ] = [ 'input' ]
+ found = False
+ for workflow_step in steps:
+ for wfsc in workflow_step.input_connections:
+ if step.name == wfsc.output_step.name:
+ data_outputs_dict[ 'name' ] = wfsc.output_name
+ found = True
+ break
+ if found:
+ break
+ if not found:
+ # We're at the last step of the workflow.
+ data_outputs_dict[ 'name' ] = 'output'
+ data_outputs.append( data_outputs_dict )
+ return data_outputs
+ return module.get_data_outputs()
+def get_workflow_from_dict( trans, workflow_dict, tools_metadata, repository_id, changeset_revision ):
+ """Return a workflow object from the dictionary object created when it was exported."""
+ trans.workflow_building_mode = True
+ workflow = trans.model.Workflow()
+ workflow.name = workflow_dict[ 'name' ]
+ workflow.has_errors = False
+ steps = []
+ # Keep ids for each step that we need to use to make connections.
+ steps_by_external_id = {}
+ # Keep track of tools required by the workflow that are not available in
+ # the tool shed repository. Each tuple in the list of missing_tool_tups
+ # will be ( tool_id, tool_name, tool_version ).
+ missing_tool_tups = []
+ # First pass to build step objects and populate basic values
+ for key, step_dict in workflow_dict[ 'steps' ].iteritems():
+ # Create the model class for the step
+ step = trans.model.WorkflowStep()
+ step.name = step_dict[ 'name' ]
+ step.position = step_dict[ 'position' ]
+ module = module_factory.from_dict( trans, repository_id, changeset_revision, step_dict, tools_metadata=tools_metadata, secure=False )
+ if module.type == 'tool' and module.tool is None:
+ # A required tool is not available in the current repository.
+ step.tool_errors = 'unavailable'
+ missing_tool_tup = ( step_dict[ 'tool_id' ], step_dict[ 'name' ], step_dict[ 'tool_version' ] )
+ if missing_tool_tup not in missing_tool_tups:
+ missing_tool_tups.append( missing_tool_tup )
+ module.save_to_step( step )
+ if step.tool_errors:
+ workflow.has_errors = True
+ # Stick this in the step temporarily.
+ step.temp_input_connections = step_dict[ 'input_connections' ]
+ steps.append( step )
+ steps_by_external_id[ step_dict[ 'id' ] ] = step
+ # Second pass to deal with connections between steps.
+ for step in steps:
+ # Input connections.
+ for input_name, conn_dict in step.temp_input_connections.iteritems():
+ if conn_dict:
+ output_step = steps_by_external_id[ conn_dict[ 'id' ] ]
+ conn = trans.model.WorkflowStepConnection()
+ conn.input_step = step
+ conn.input_name = input_name
+ conn.output_step = output_step
+ conn.output_name = conn_dict[ 'output_name' ]
+ step.input_connections.append( conn )
+ del step.temp_input_connections
+ # Order the steps if possible.
+ galaxy.webapps.galaxy.controllers.workflow.attach_ordered_steps( workflow, steps )
+ return workflow, missing_tool_tups
+def get_workflow_module_name( module, missing_tool_tups ):
+ module_name = module.get_name()
+ if module.type == 'tool' and module_name == 'unavailable':
+ for missing_tool_tup in missing_tool_tups:
+ missing_tool_id, missing_tool_name, missing_tool_version = missing_tool_tup
+ if missing_tool_id == module.tool_id:
+                module_name = missing_tool_name
+ break
+ return module_name
\ No newline at end of file
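For readers following the new get_workflow_from_dict() above: the two-pass reconstruction (first build step objects, then wire connections by each step's external id) can be sketched standalone as below. Step and Connection are simplified stand-ins for the Galaxy model classes (trans.model.WorkflowStep / WorkflowStepConnection), and build_workflow is a hypothetical helper name, not part of the patch.

```python
class Step(object):
    def __init__(self, external_id, name):
        self.external_id = external_id
        self.name = name
        self.input_connections = []

class Connection(object):
    def __init__(self, input_step, input_name, output_step, output_name):
        self.input_step = input_step
        self.input_name = input_name
        self.output_step = output_step
        self.output_name = output_name

def build_workflow(step_dicts):
    steps = []
    steps_by_external_id = {}
    # First pass: create a step per dictionary and index it by its external id.
    for step_dict in step_dicts:
        step = Step(step_dict['id'], step_dict['name'])
        # Stick the raw connection info in the step temporarily.
        step.temp_input_connections = step_dict.get('input_connections', {})
        steps.append(step)
        steps_by_external_id[step_dict['id']] = step
    # Second pass: resolve each recorded connection against the index.
    for step in steps:
        for input_name, conn_dict in step.temp_input_connections.items():
            if conn_dict:
                output_step = steps_by_external_id[conn_dict['id']]
                step.input_connections.append(
                    Connection(step, input_name, output_step,
                               conn_dict['output_name']))
        del step.temp_input_connections
    return steps
```

The two passes are needed because a connection can reference a step that appears later in the exported dictionary, so all steps must exist before any wiring happens.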
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -1,10 +1,10 @@
import urllib2, tempfile
from admin import *
-from galaxy.util.json import from_json_string, to_json_string
+from galaxy.util import json
import galaxy.util.shed_util as shed_util
import galaxy.util.shed_util_common as suc
from galaxy.tool_shed import encoding_util
-from galaxy.webapps.community.util import container_util
+from galaxy.webapps.community.util import container_util, workflow_util
from galaxy import eggs, tools
eggs.require( 'mercurial' )
@@ -552,6 +552,11 @@
galaxy_url = url_for( '/', qualified=True )
url = suc.url_join( tool_shed_url, 'repository/find_workflows?galaxy_url=%s' % galaxy_url )
return trans.response.send_redirect( url )
+ @web.expose
+ @web.require_admin
+ def generate_workflow_image( self, trans, workflow_name, repository_id=None ):
+ """Return an svg image representation of a workflow dictionary created when the workflow was exported."""
+ return workflow_util.generate_workflow_image( trans, workflow_name, repository_metadata_id=None, repository_id=repository_id )
@web.json
@web.require_admin
def get_file_contents( self, trans, file_path ):
@@ -576,7 +581,7 @@
raw_text = response.read()
response.close()
if len( raw_text ) > 2:
- encoded_text = from_json_string( raw_text )
+ encoded_text = json.from_json_string( raw_text )
text = encoding_util.tool_shed_decode( encoded_text )
else:
text = ''
@@ -586,6 +591,71 @@
return tool_version.get_version_ids( app, reverse=True )
@web.expose
@web.require_admin
+ def import_workflow( self, trans, workflow_name, repository_id, **kwd ):
+ # FIXME: importing doesn't yet work...
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if workflow_name:
+ workflow_name = encoding_util.tool_shed_decode( workflow_name )
+ repository = suc.get_tool_shed_repository_by_id( trans, repository_id )
+ changeset_revision = repository.changeset_revision
+ metadata = repository.metadata
+ workflows = metadata[ 'workflows' ]
+ tools_metadata = metadata[ 'tools' ]
+ workflow_dict = None
+ for workflow_data_tuple in workflows:
+ # The value of workflow_data_tuple is ( relative_path_to_workflow_file, exported_workflow_dict ).
+ relative_path_to_workflow_file, exported_workflow_dict = workflow_data_tuple
+ if exported_workflow_dict[ 'name' ] == workflow_name:
+ # If the exported workflow is available on disk, import it.
+ if os.path.exists( relative_path_to_workflow_file ):
+ workflow_file = open( relative_path_to_workflow_file, 'rb' )
+ workflow_data = workflow_file.read()
+ workflow_file.close()
+ workflow_dict = json.from_json_string( workflow_data )
+ else:
+ # Use the current exported_workflow_dict.
+ workflow_dict = exported_workflow_dict
+ break
+ if workflow_dict:
+ # Create workflow if possible. If a required tool is not available in the local
+ # Galaxy instance, the tool information will be available in the step_dict.
+ src = None
+ workflow, missing_tool_tups = workflow_util.get_workflow_from_dict( trans,
+ workflow_dict,
+ tools_metadata,
+ repository_id,
+ changeset_revision )
+ if workflow_name:
+ workflow.name = workflow_name
+ # Provide user feedback and show workflow list.
+ if workflow.has_errors:
+ message += "Imported, but some steps in this workflow have validation errors. "
+ status = "error"
+ if workflow.has_cycles:
+ message += "Imported, but this workflow contains cycles. "
+ status = "error"
+ else:
+ message += "Workflow <b>%s</b> imported successfully. " % workflow.name
+ if missing_tool_tups:
+                        # TODO: rework this since it is used in the tool shed but shouldn't be used in Galaxy.
+ name_and_id_str = ''
+ for missing_tool_tup in missing_tool_tups:
+ tool_id, tool_name, other = missing_tool_tup
+                            name_and_id_str += 'name: %s, id: %s; ' % ( str( tool_name ), str( tool_id ) )
+ log.debug( "The following tools required by this workflow are missing from this Galaxy instance: %s" % name_and_id_str )
+ else:
+ message += 'The workflow named %s is not included in the metadata for revision %s of repository %s' % \
+ ( str( workflow_name ), str( changeset_revision ), str( repository.name ) )
+ status = 'error'
+ return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
+ action='browse_repository',
+ id=repository_id,
+ message=message,
+ status=status ) )
+ @web.expose
+ @web.require_admin
def initiate_repository_installation( self, trans, shed_repository_ids, encoded_kwd, reinstalling=False ):
tsr_ids = util.listify( shed_repository_ids )
tool_shed_repositories = []
@@ -728,7 +798,7 @@
text = response.read()
response.close()
if text:
- tool_version_dicts = from_json_string( text )
+ tool_version_dicts = json.from_json_string( text )
shed_util.handle_tool_versions( trans.app, tool_version_dicts, tool_shed_repository )
else:
message += "Version information for the tools included in the <b>%s</b> repository is missing. " % name
@@ -1084,7 +1154,7 @@
response = urllib2.urlopen( url )
raw_text = response.read()
response.close()
- repo_information_dict = from_json_string( raw_text )
+ repo_information_dict = json.from_json_string( raw_text )
includes_tools = util.string_as_bool( repo_information_dict.get( 'includes_tools', False ) )
includes_repository_dependencies = util.string_as_bool( repo_information_dict.get( 'includes_repository_dependencies', False ) )
includes_tool_dependencies = util.string_as_bool( repo_information_dict.get( 'includes_tool_dependencies', False ) )
@@ -1191,7 +1261,7 @@
response = urllib2.urlopen( url )
raw_text = response.read()
response.close()
- readme_files_dict = from_json_string( raw_text )
+ readme_files_dict = json.from_json_string( raw_text )
# Since we are installing a new repository, no tool dependencies will be considered "missing". Most of the repository contents
# are set to None since we don't yet know what they are.
containers_dict = suc.build_repository_containers_for_galaxy( trans=trans,
@@ -1593,7 +1663,7 @@
text = response.read()
response.close()
if text:
- tool_version_dicts = from_json_string( text )
+ tool_version_dicts = json.from_json_string( text )
shed_util.handle_tool_versions( trans.app, tool_version_dicts, repository )
message = "Tool versions have been set for all included tools."
status = 'done'
@@ -1785,7 +1855,26 @@
tool_lineage=tool_lineage,
message=message,
status=status )
-
+ @web.expose
+ @web.require_admin
+ def view_workflow( self, trans, workflow_name=None, repository_id=None, **kwd ):
+ """Retrieve necessary information about a workflow from the database so that it can be displayed in an svg image."""
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if workflow_name:
+ workflow_name = encoding_util.tool_shed_decode( workflow_name )
+ repository = suc.get_tool_shed_repository_by_id( trans, repository_id )
+ changeset_revision = repository.changeset_revision
+ metadata = repository.metadata
+ return trans.fill_template( "/admin/tool_shed_repository/view_workflow.mako",
+ repository=repository,
+ changeset_revision=changeset_revision,
+ repository_id=repository_id,
+ workflow_name=workflow_name,
+ metadata=metadata,
+ message=message,
+ status=status )
## ---- Utility methods -------------------------------------------------------
def build_shed_tool_conf_select_field( trans ):
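A standalone sketch of the workflow-lookup step inside the new import_workflow() controller method above: find the named workflow in the repository metadata, prefer the exported .ga file if it exists on disk, and fall back to the embedded dictionary otherwise. load_exported_workflow is a hypothetical helper name, not part of the patch.

```python
import json
import os

def load_exported_workflow(workflow_name, workflows):
    """Return the exported workflow dict named workflow_name, or None.

    workflows is a list of ( relative_path_to_ga_file, exported_workflow_dict )
    tuples, mirroring metadata[ 'workflows' ] in the patch above.
    """
    for relative_path, exported_dict in workflows:
        if exported_dict.get('name') != workflow_name:
            continue
        # If the exported workflow is available on disk, load it from there.
        if os.path.exists(relative_path):
            with open(relative_path, 'r') as fh:
                return json.loads(fh.read())
        # Otherwise use the dict that was stored in the metadata.
        return exported_dict
    return None
```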
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -2,7 +2,7 @@
from elementtree.ElementTree import Element
from galaxy import web
from galaxy.tools.parameters import DataToolParameter, DummyDataset, RuntimeValue, check_param, visit_input_values
-from galaxy.tools import DefaultToolState
+import galaxy.tools
from galaxy.util.bunch import Bunch
from galaxy.util.json import from_json_string, to_json_string
from galaxy.jobs.actions.post import ActionBox
@@ -139,7 +139,7 @@
return dict( input=DataToolParameter( None, Element( "param", name="input", label=label, multiple=True, type="data", format=', '.join(filter_set) ), self.trans ) )
def get_runtime_state( self ):
- state = DefaultToolState()
+ state = galaxy.tools.DefaultToolState()
state.inputs = dict( input=None )
return state
@@ -149,7 +149,7 @@
def decode_runtime_state( self, trans, string ):
fake_tool = Bunch( inputs = self.get_runtime_inputs() )
- state = DefaultToolState()
+ state = galaxy.tools.DefaultToolState()
state.decode( string, fake_tool, trans.app )
return state
@@ -192,7 +192,7 @@
def from_dict( Class, trans, d, secure=True ):
tool_id = d[ 'tool_id' ]
module = Class( trans, tool_id )
- module.state = DefaultToolState()
+ module.state = galaxy.tools.DefaultToolState()
if module.tool is not None:
module.state.decode( d[ "tool_state" ], module.tool, module.trans.app, secure=secure )
module.errors = d.get( "tool_errors", None )
@@ -213,7 +213,7 @@
tool_id = tool.id
if ( trans.app.toolbox and tool_id in trans.app.toolbox.tools_by_id ):
module = Class( trans, tool_id )
- module.state = DefaultToolState()
+ module.state = galaxy.tools.DefaultToolState()
module.state.inputs = module.tool.params_from_strings( step.tool_inputs, trans.app, ignore_errors=True )
module.errors = step.tool_errors
# module.post_job_actions = step.post_job_actions
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 templates/admin/tool_shed_repository/view_workflow.mako
--- /dev/null
+++ b/templates/admin/tool_shed_repository/view_workflow.mako
@@ -0,0 +1,45 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+<%namespace file="/webapps/community/common/common.mako" import="*" />
+<%namespace file="/webapps/community/repository/common.mako" import="*" />
+
+<%
+ from galaxy.web.framework.helpers import time_ago
+ from galaxy.tool_shed.encoding_util import tool_shed_encode
+%>
+
+<%!
+ def inherit(context):
+ if context.get('use_panels'):
+ return '/webapps/community/base_panels.mako'
+ else:
+ return '/base.mako'
+%>
+<%inherit file="${inherit(context)}"/>
+
+<%def name="render_workflow( workflow_name, repository_id )">
+ <% center_url = h.url_for( controller='admin_toolshed', action='generate_workflow_image', workflow_name=tool_shed_encode( workflow_name ), repository_id=repository_id ) %>
+ <iframe name="galaxy_main" id="galaxy_main" frameborder="0" style="position: absolute; width: 100%; height: 100%;" src="${center_url}"></iframe>
+</%def>
+
+<br/><br/>
+<ul class="manage-table-actions">
+    <li><a class="action-button menubutton" id="repository-${repository.id}-popup">Repository Actions</a></li>
+ <div popupmenu="repository-${repository.id}-popup">
+ <li><a class="action-button" href="${h.url_for( controller='admin_toolshed', action='import_workflow', workflow_name=tool_shed_encode( workflow_name ), repository_id=repository_id )}">Import workflow to Galaxy</a></li>
+ </div>
+</ul>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<div class="toolFormTitle">${workflow_name | h}</div>
+<div class="form-row">
+ <div class="toolParamHelp" style="clear: both;">
+ (this page displays SVG graphics)
+ </div>
+</div>
+<br clear="left"/>
+
+${render_workflow( workflow_name, repository_id )}
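The SVG rendered into the iframe above is sized at the end of generate_workflow_image() from the rightmost step position, the widest step box, and the lowest step position. That arithmetic reduces to the sketch below; svg_canvas_attrs is a hypothetical helper name, and the 'N px' width/height strings mirror what the patch emits.

```python
def svg_canvas_attrs(max_x, max_y, max_width):
    # Pad to the right of the widest, rightmost step box and leave
    # room below the lowest step, as generate_workflow_image() does.
    width = max_x + max_width + 50
    height = max_y + 300
    return {
        'width': '%s px' % width,
        'height': '%s px' % height,
        'viewBox': '0 0 %s %s' % (width, height),
    }
```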
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 templates/webapps/community/repository/common.mako
--- a/templates/webapps/community/repository/common.mako
+++ b/templates/webapps/community/repository/common.mako
@@ -231,6 +231,10 @@
folder_label = "%s<i> - this repository's tools require handling of these dependencies</i>" % folder_label
col_span_str = 'colspan="4"'
elif folder.workflows:
+ if folder.description:
+ folder_label = "%s<i> - %s</i>" % ( folder_label, folder.description )
+ else:
+ folder_label = "%s<i> - click the name to view an SVG image of the workflow</i>" % folder_label
col_span_str = 'colspan="4"'
%><td ${col_span_str} style="padding-left: ${folder_pad}px;">
@@ -439,7 +443,11 @@
<a class="action-button" href="${h.url_for( controller='repository', action='view_tool_metadata', repository_id=trans.security.encode_id( tool.repository_id ), changeset_revision=tool.changeset_revision, tool_id=tool.tool_id )}">View tool metadata</a></div>
%else:
- <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='view_tool_metadata', repository_id=trans.security.encode_id( tool.repository_id ), changeset_revision=tool.changeset_revision, tool_id=tool.tool_id )}">${tool.name | h}</a>
+ %if tool.repository_installation_status == trans.model.ToolShedRepository.installation_status.INSTALLED:
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='view_tool_metadata', repository_id=trans.security.encode_id( tool.repository_id ), changeset_revision=tool.changeset_revision, tool_id=tool.tool_id )}">${tool.name | h}</a>
+ %else:
+ ${tool.name | h}
+ %endif
%endif
%else:
${tool.name | h}
@@ -515,6 +523,13 @@
<%
from galaxy.tool_shed.encoding_util import tool_shed_encode
encoded_id = trans.security.encode_id( workflow.id )
+ encoded_workflow_name = tool_shed_encode( workflow.workflow_name )
+ if trans.webapp.name == 'community':
+ encoded_repository_metadata_id = trans.security.encode_id( workflow.repository_metadata_id )
+ encoded_repository_id = None
+ else:
+ encoded_repository_metadata_id = None
+ encoded_repository_id = trans.security.encode_id( workflow.repository_id )
if row_is_header:
cell_type = 'th'
else:
@@ -528,8 +543,12 @@
<${cell_type} style="padding-left: ${pad+20}px;">
%if row_is_header:
${workflow.workflow_name | h}
+ %elif trans.webapp.name == 'community' and encoded_repository_metadata_id:
+ <a href="${h.url_for( controller='repository', action='view_workflow', workflow_name=encoded_workflow_name, repository_metadata_id=encoded_repository_metadata_id )}">${workflow.workflow_name | h}</a>
+ %elif trans.webapp.name == 'galaxy' and encoded_repository_id:
+ <a href="${h.url_for( controller='admin_toolshed', action='view_workflow', workflow_name=encoded_workflow_name, repository_id=encoded_repository_id )}">${workflow.workflow_name | h}</a>
%else:
- <a href="${h.url_for( controller='workflow', action='view_workflow', repository_metadata_id=trans.security.encode_id( workflow.repository_metadata_id ), workflow_name=tool_shed_encode( workflow.workflow_name ) )}">${workflow.workflow_name | h}</a>
+ ${workflow.workflow_name | h}
%endif
</${cell_type}><${cell_type}>${workflow.steps | h}</${cell_type}>
@@ -557,7 +576,7 @@
missing_repository_dependencies_root_folder = containers_dict.get( 'missing_repository_dependencies', None )
tool_dependencies_root_folder = containers_dict.get( 'tool_dependencies', None )
missing_tool_dependencies_root_folder = containers_dict.get( 'missing_tool_dependencies', None )
- valid_tools_root_folder = containers_dict.get( 'valid_tools', none )
+ valid_tools_root_folder = containers_dict.get( 'valid_tools', None )
workflows_root_folder = containers_dict.get( 'workflows', None )
has_contents = datatypes_root_folder or invalid_tools_root_folder or valid_tools_root_folder or workflows_root_folder
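Back in the new workflow_util.py, RepoToolModule.get_data_outputs() resolves an output's formats from format_source: when an output declares format_source, its formats follow the referenced input's extensions, otherwise the declared format is used. A simplified standalone sketch (resolve_output_formats is a hypothetical helper name; the tool_output dict stands in for the tool output object, and the <change_format> handling is omitted for brevity):

```python
def resolve_output_formats(tool_output, data_inputs):
    """Return the list of formats for a tool output.

    tool_output is a dict with 'format_source' and 'format' keys; data_inputs
    is the list of dicts produced by get_data_inputs().
    """
    if tool_output.get('format_source') is not None:
        # Default to the special format 'input', which removes restrictions
        # on connections, in case the referenced input is not found.
        formats = ['input']
        for di in data_inputs:
            # Input names come prefixed with conditional and repeat names
            # separated by '|', so compare only the last component.
            name = di.get('name')
            if name is not None and name.split('|')[-1] == tool_output['format_source']:
                formats = di['extensions']
    else:
        formats = [tool_output['format']]
    return formats
```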
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 templates/webapps/community/repository/view_workflow.mako
--- a/templates/webapps/community/repository/view_workflow.mako
+++ b/templates/webapps/community/repository/view_workflow.mako
@@ -7,22 +7,23 @@
from galaxy.web.framework.helpers import time_ago
from galaxy.tool_shed.encoding_util import tool_shed_encode
- in_tool_shed = trans.webapp.name == 'community'
is_admin = trans.user_is_admin()
is_new = repository.is_new( trans.app )
can_manage = is_admin or trans.user == repository.user
- can_contact_owner = in_tool_shed and trans.user and trans.user != repository.user
- can_push = in_tool_shed and trans.app.security_agent.can_push( trans.app, trans.user, repository )
+ can_contact_owner = trans.user and trans.user != repository.user
+ can_push = trans.app.security_agent.can_push( trans.app, trans.user, repository )
can_upload = can_push
- can_download = in_tool_shed and not is_new and ( not is_malicious or can_push )
- can_browse_contents = in_tool_shed and not is_new
- can_rate = in_tool_shed and not is_new and trans.user and repository.user != trans.user
- can_view_change_log = in_tool_shed and not is_new
+ can_download = not is_new and ( not is_malicious or can_push )
+ can_browse_contents = not is_new
+ can_rate = not is_new and trans.user and repository.user != trans.user
+ can_view_change_log = not is_new
if can_push:
browse_label = 'Browse or delete repository tip files'
else:
browse_label = 'Browse repository tip files'
- has_readme = metadata and 'readme' in metadata
+ has_readme = metadata and 'readme_files' in metadata
+
+ # <li><a class="action-button" href="${h.url_for( controller='repository', action='install_repositories_by_revision', repository_ids=trans.security.encode_id( repository.id ), changeset_revisions=changeset_revision )}">Install repository to Galaxy</a></li>
%><%!
@@ -34,57 +35,43 @@
%><%inherit file="${inherit(context)}"/>
-<%def name="render_workflow( repository_metadata_id, workflow_name )">
- <% center_url = h.url_for( controller='workflow', action='generate_workflow_image', repository_metadata_id=repository_metadata_id, workflow_name=tool_shed_encode( workflow_name ) ) %>
+<%def name="render_workflow( workflow_name, repository_metadata_id )">
+ <% center_url = h.url_for( controller='repository', action='generate_workflow_image', workflow_name=tool_shed_encode( workflow_name ), repository_metadata_id=repository_metadata_id, repository_id=None ) %><iframe name="galaxy_main" id="galaxy_main" frameborder="0" style="position: absolute; width: 100%; height: 100%;" src="${center_url}"></iframe></%def><br/><br/><ul class="manage-table-actions">
- %if in_tool_shed:
- %if is_new and can_upload:
- <a class="action-button" href="${h.url_for( controller='upload', action='upload', repository_id=trans.security.encode_id( repository.id ) )}">Upload files to repository</a>
- %else:
- <li><a class="action-button" id="repository-${repository.id}-popup" class="menubutton">Repository Actions</a></li>
- <div popupmenu="repository-${repository.id}-popup">
- %if can_upload:
- <a class="action-button" href="${h.url_for( controller='upload', action='upload', repository_id=trans.security.encode_id( repository.id ) )}">Upload files to repository</a>
- %endif
- %if can_manage:
- <a class="action-button" href="${h.url_for( controller='repository', action='manage_repository', id=trans.app.security.encode_id( repository.id ), changeset_revision=repository.tip( trans.app ) )}">Manage repository</a>
- %else:
- <a class="action-button" href="${h.url_for( controller='repository', action='view_repository', id=trans.app.security.encode_id( repository.id ), changeset_revision=repository.tip( trans.app ) )}">View repository</a>
- %endif
- %if can_view_change_log:
- <a class="action-button" href="${h.url_for( controller='repository', action='view_changelog', id=trans.app.security.encode_id( repository.id ) )}">View change log</a>
- %endif
- %if can_rate:
- <a class="action-button" href="${h.url_for( controller='repository', action='rate_repository', id=trans.app.security.encode_id( repository.id ) )}">Rate repository</a>
- %endif
- %if can_browse_contents:
- <a class="action-button" href="${h.url_for( controller='repository', action='browse_repository', id=trans.app.security.encode_id( repository.id ) )}">${browse_label}</a>
- %endif
- %if can_contact_owner:
- <a class="action-button" href="${h.url_for( controller='repository', action='contact_owner', id=trans.security.encode_id( repository.id ) )}">Contact repository owner</a>
- %endif
- %if can_download:
- <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='gz' )}">Download as a .tar.gz file</a>
- <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='bz2' )}">Download as a .tar.bz2 file</a>
- <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='zip' )}">Download as a zip file</a>
- %endif
- </div>
- %endif
+ %if is_new and can_upload:
+ <a class="action-button" href="${h.url_for( controller='upload', action='upload', repository_id=trans.security.encode_id( repository.id ) )}">Upload files to repository</a>
%else:
<li><a class="action-button" id="repository-${repository.id}-popup" class="menubutton">Repository Actions</a></li><div popupmenu="repository-${repository.id}-popup">
- <li><a class="action-button" href="${h.url_for( controller='workflow', action='import_workflow', repository_metadata_id=repository_metadata_id, workflow_name=tool_shed_encode( workflow_name ) )}">Import workflow to local Galaxy</a></li>
- <li><a class="action-button" href="${h.url_for( controller='repository', action='install_repositories_by_revision', repository_ids=trans.security.encode_id( repository.id ), changeset_revisions=changeset_revision )}">Install repository to local Galaxy</a></li>
- </div>
- <li><a class="action-button" id="toolshed-${repository.id}-popup" class="menubutton">Tool Shed Actions</a></li>
- <div popupmenu="toolshed-${repository.id}-popup">
- <a class="action-button" href="${h.url_for( controller='repository', action='browse_valid_categories' )}">Browse valid repositories</a>
- <a class="action-button" href="${h.url_for( controller='repository', action='find_tools' )}">Search for valid tools</a>
- <a class="action-button" href="${h.url_for( controller='repository', action='find_workflows' )}">Search for workflows</a>
+ %if can_upload:
+ <a class="action-button" href="${h.url_for( controller='upload', action='upload', repository_id=trans.security.encode_id( repository.id ) )}">Upload files to repository</a>
+ %endif
+ %if can_manage:
+ <a class="action-button" href="${h.url_for( controller='repository', action='manage_repository', id=trans.app.security.encode_id( repository.id ), changeset_revision=repository.tip( trans.app ) )}">Manage repository</a>
+ %else:
+ <a class="action-button" href="${h.url_for( controller='repository', action='view_repository', id=trans.app.security.encode_id( repository.id ), changeset_revision=repository.tip( trans.app ) )}">View repository</a>
+ %endif
+ %if can_view_change_log:
+ <a class="action-button" href="${h.url_for( controller='repository', action='view_changelog', id=trans.app.security.encode_id( repository.id ) )}">View change log</a>
+ %endif
+ %if can_rate:
+ <a class="action-button" href="${h.url_for( controller='repository', action='rate_repository', id=trans.app.security.encode_id( repository.id ) )}">Rate repository</a>
+ %endif
+ %if can_browse_contents:
+ <a class="action-button" href="${h.url_for( controller='repository', action='browse_repository', id=trans.app.security.encode_id( repository.id ) )}">${browse_label}</a>
+ %endif
+ %if can_contact_owner:
+ <a class="action-button" href="${h.url_for( controller='repository', action='contact_owner', id=trans.security.encode_id( repository.id ) )}">Contact repository owner</a>
+ %endif
+ %if can_download:
+ <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='gz' )}">Download as a .tar.gz file</a>
+ <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='bz2' )}">Download as a .tar.bz2 file</a>
+ <a class="action-button" href="${h.url_for( controller='repository', action='download', repository_id=trans.app.security.encode_id( repository.id ), changeset_revision=changeset_revision, file_type='zip' )}">Download as a zip file</a>
+ %endif
</div>
%endif
</ul>
@@ -102,4 +89,4 @@
</div><br clear="left"/>
-${render_workflow( repository_metadata_id, workflow_name )}
+${render_workflow( workflow_name, repository_metadata_id )}
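The template change above also renames the metadata key the readme check uses, from `'readme'` to `'readme_files'`. A minimal sketch of the equivalent check (the function name `has_readme` here is hypothetical; the template computes it inline):

```python
# Hypothetical sketch of the updated readme guard in view_workflow.mako:
# readme entries are now stored under the 'readme_files' metadata key,
# so the old 'readme' key no longer counts.
def has_readme(metadata):
    # metadata may be None or a dict; coerce the truthiness to a bool.
    return bool(metadata and 'readme_files' in metadata)
```

Any code still probing the old `'readme'` key will silently report no readme after this change.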
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 test/tool_shed/base/twilltestcase.py
--- a/test/tool_shed/base/twilltestcase.py
+++ b/test/tool_shed/base/twilltestcase.py
@@ -454,9 +454,11 @@
( self.security.encode_id( repository.id ), tool_xml_path, changeset_revision )
self.visit_url( url )
self.check_for_strings( strings_displayed, strings_not_displayed )
- def load_workflow_image( self, repository, workflow_name, strings_displayed=[], strings_not_displayed=[] ):
+ def load_workflow_image_in_tool_shed( self, repository, workflow_name, strings_displayed=[], strings_not_displayed=[] ):
+ # FIXME: Can not always assume the first repository_metadata record is the correct one.
+ # TODO: Add a method for displaying a workflow image in Galaxy.
metadata = self.get_repository_metadata( repository )
- url = '/workflow/generate_workflow_image?repository_metadata_id=%s&workflow_name=%s' % \
+ url = '/repository/generate_workflow_image?repository_metadata_id=%s&workflow_name=%s' % \
( self.security.encode_id( metadata[0].id ), tool_shed_encode( workflow_name ) )
self.visit_url( url )
self.check_for_strings( strings_displayed, strings_not_displayed )
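The renamed `load_workflow_image_in_tool_shed` helper above now targets the `repository` controller instead of `workflow`. A standalone sketch of the URL it builds (the helper function name here is hypothetical; both arguments are assumed to be pre-encoded via `security.encode_id` and `tool_shed_encode`, as in the twill test case):

```python
# Hypothetical sketch mirroring the URL built by the renamed
# load_workflow_image_in_tool_shed test helper. Note the controller
# changed from 'workflow' to 'repository'.
def generate_workflow_image_url(encoded_metadata_id, encoded_workflow_name):
    # Both values are assumed already encoded by the caller.
    return ('/repository/generate_workflow_image'
            '?repository_metadata_id=%s&workflow_name=%s'
            % (encoded_metadata_id, encoded_workflow_name))
```

The FIXME in the hunk still applies: the helper takes the first `repository_metadata` record, which is not always the correct revision.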
diff -r 875ac898df00fd919b6b24f58562fadbf03dc5e1 -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 test/tool_shed/functional/test_0060_workflows.py
--- a/test/tool_shed/functional/test_0060_workflows.py
+++ b/test/tool_shed/functional/test_0060_workflows.py
@@ -46,7 +46,7 @@
workflow_filename,
filepath=workflow_filepath,
commit_message='Uploaded filtering workflow.' )
- self.load_workflow_image( repository, workflow_name, strings_displayed=[ '#EBBCB2' ] )
+ self.load_workflow_image_in_tool_shed( repository, workflow_name, strings_displayed=[ '#EBBCB2' ] )
def test_0020_upload_tool( self ):
'''Upload the missing tool for the workflow in the previous step, and verify that the error is no longer present.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
@@ -54,7 +54,7 @@
'filtering/filtering_2.2.0.tar',
commit_message="Uploaded filtering 2.2.0",
remove_repo_files_not_in_tar='No' )
- self.load_workflow_image( repository, workflow_name, strings_not_displayed=[ '#EBBCB2' ] )
+ self.load_workflow_image_in_tool_shed( repository, workflow_name, strings_not_displayed=[ '#EBBCB2' ] )
def test_0025_verify_repository_metadata( self ):
'''Verify that resetting the metadata does not change it.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.