galaxy-commits
September 2013: 1 participant, 149 discussions
commit/galaxy-central: greg: Enable repository dependencies to be specially categorized if they should be installed only to allow the dependent repository's tool dependency to be compiled (in most cases the tool dependency should hopefully be installed as a compiled binary).
by commits-noreply@bitbucket.org 09 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ed3c78662274/
Changeset: ed3c78662274
User: greg
Date: 2013-09-09 20:26:58
Summary: Enable repository dependencies to be specially categorized if they should be installed only to allow the dependent repository's tool dependency to be compiled (in most cases the tool dependency should hopefully be installed as a compiled binary).
Affected #: 11 files
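The heart of this changeset is an extension of the repository dependency tuple from five elements to six, where the new sixth element, only_if_compiling_contained_td, flags dependencies that exist only so a contained tool dependency can be compiled. A minimal sketch of how the flag changes the prior-installation decision (not Galaxy code; galaxy.util.asbool is approximated by a local helper):

```python
def asbool(value):
    # Rough stand-in for galaxy.util.asbool: common truthy strings map to True.
    return str(value).strip().lower() in ('true', 'yes', 'on', '1')

def needs_prior_install(rd_tup):
    """Interpret a repository dependency tuple of length 5 (legacy) or 6 (new).

    A dependency marked only_if_compiling_contained_td is never required to be
    installed beforehand; it is considered only when the dependent repository's
    tool dependency cannot be installed as a compiled binary.
    """
    if len(rd_tup) == 5:
        tool_shed, name, owner, changeset_revision, prior_required = rd_tup
        return asbool(prior_required)
    if len(rd_tup) == 6:
        (tool_shed, name, owner, changeset_revision,
         prior_required, only_if_compiling) = rd_tup
        return asbool(prior_required) and not asbool(only_if_compiling)
    return False

legacy = ('http://localhost:9009', 'package_numpy_1_7', 'test', 'cddd64ecd985', 'True')
compile_only = legacy[:4] + ('True', 'True')
print(needs_prior_install(legacy), needs_prior_install(compile_only))  # True False
```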
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -3571,6 +3571,10 @@
@property
def repository_dependencies( self ):
+ """
+ Return all of this repository's repository dependencies, ignoring their attributes like prior_installation_required and
+ only_if_compiling_contained_td.
+ """
required_repositories = []
for rrda in self.required_repositories:
repository_dependency = rrda.repository_dependency
@@ -3608,7 +3612,7 @@
if required_repository.status == self.installation_status.ERROR:
required_repositories_with_installation_errors.append( required_repository )
return required_repositories_with_installation_errors
-
+
@property
def requires_prior_installation_of( self ):
"""
@@ -3624,15 +3628,18 @@
if self.has_repository_dependencies:
rd_tups = self.metadata[ 'repository_dependencies' ][ 'repository_dependencies' ]
for rd_tup in rd_tups:
- if len( rd_tup ) == 4:
- # For backward compatibility to the 12/20/12 Galaxy release, default prior_installation_required to False.
- tool_shed, name, owner, changeset_revision = rd_tup
- prior_installation_required = False
- elif len( rd_tup ) == 5:
+ if len( rd_tup ) == 5:
tool_shed, name, owner, changeset_revision, prior_installation_required = rd_tup
- prior_installation_required = galaxy.util.asbool( str( prior_installation_required ) )
- if prior_installation_required:
- required_rd_tups_that_must_be_installed.append( ( tool_shed, name, owner, changeset_revision, prior_installation_required ) )
+ if galaxy.util.asbool( prior_installation_required ):
+ required_rd_tups_that_must_be_installed.append( ( tool_shed, name, owner, changeset_revision, 'True', 'False' ) )
+ elif len( rd_tup ) == 6:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = rd_tup
+ # The repository dependency will only be required to be previously installed if it does not fall into the category of
+ # a repository that must be installed only so that its contained tool dependency can be used for compiling the tool
+ # dependency of the dependent repository.
+ if not galaxy.util.asbool( only_if_compiling_contained_td ):
+ if galaxy.util.asbool( prior_installation_required ):
+ required_rd_tups_that_must_be_installed.append( ( tool_shed, name, owner, changeset_revision, 'True', 'False' ) )
return required_rd_tups_that_must_be_installed
@property
@@ -3686,6 +3693,19 @@
return tool_shed_url.rstrip( '/' )
@property
+ def tuples_of_repository_dependencies_needed_for_compiling_td( self ):
+ """Return this repository's repository dependencies that are necessary only for compiling this repository's tool dependencies."""
+ rd_tups_of_repositories_needed_for_compiling_td = []
+ if self.has_repository_dependencies:
+ rd_tups = self.metadata[ 'repository_dependencies' ][ 'repository_dependencies' ]
+ for rd_tup in rd_tups:
+ if len( rd_tup ) == 6:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = rd_tup
+ if galaxy.util.asbool( only_if_compiling_contained_td ):
+ rd_tups_of_repositories_needed_for_compiling_td.append( ( tool_shed, name, owner, changeset_revision, 'False', 'True' ) )
+ return rd_tups_of_repositories_needed_for_compiling_td
+
+ @property
def uninstalled_repository_dependencies( self ):
"""Return the repository's repository dependencies that have been uninstalled."""
uninstalled_required_repositories = []
@@ -3723,6 +3743,7 @@
def __init__( self, tool_shed_repository_id=None ):
self.tool_shed_repository_id = tool_shed_repository_id
+
class ToolDependency( object ):
installation_status = Bunch( NEVER_INSTALLED='Never installed',
INSTALLING='Installing',
@@ -3734,6 +3755,7 @@
WARNING = 'queued',
ERROR = 'error',
UNINSTALLED = 'deleted_new' )
+
def __init__( self, tool_shed_repository_id=None, name=None, version=None, type=None, status=None, error_message=None ):
self.tool_shed_repository_id = tool_shed_repository_id
self.name = name
@@ -3741,21 +3763,26 @@
self.type = type
self.status = status
self.error_message = error_message
+
@property
def can_install( self ):
return self.status in [ self.installation_status.NEVER_INSTALLED, self.installation_status.UNINSTALLED ]
+
@property
def can_uninstall( self ):
return self.status in [ self.installation_status.ERROR, self.installation_status.INSTALLED ]
+
@property
def can_update( self ):
return self.status in [ self.installation_status.NEVER_INSTALLED,
self.installation_status.INSTALLED,
self.installation_status.ERROR,
self.installation_status.UNINSTALLED ]
+
@property
def in_error_state( self ):
return self.status == self.installation_status.ERROR
+
def installation_directory( self, app ):
if self.type == 'package':
return os.path.join( app.config.tool_dependency_dir,
@@ -3772,6 +3799,7 @@
self.tool_shed_repository.name,
self.tool_shed_repository.installed_changeset_revision )
+
class ToolVersion( object, Dictifiable ):
dict_element_visible_keys = ( 'id', 'tool_shed_repository' )
def __init__( self, id=None, create_time=None, tool_id=None, tool_shed_repository=None ):
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -1648,7 +1648,7 @@
encoded_repository_ids = []
changeset_revisions = []
for required_repository_tup in decoded_required_repository_tups:
- tool_shed, name, owner, changeset_revision, prior_installation_required = \
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( required_repository_tup )
repository = suc.get_repository_by_name_and_owner( trans.app, name, owner )
encoded_repository_ids.append( trans.security.encode_id( repository.id ) )
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/galaxy_install/repository_util.py
--- a/lib/tool_shed/galaxy_install/repository_util.py
+++ b/lib/tool_shed/galaxy_install/repository_util.py
@@ -101,7 +101,8 @@
# rd_key is something like: 'http://localhost:9009__ESEP__package_rdkit_2012_12__ESEP__test__ESEP__d635f…'
# rd_val is something like: [['http://localhost:9009', 'package_numpy_1_7', 'test', 'cddd64ecd985', 'True']]
try:
- tool_shed, name, owner, changeset_revision, prior_installation_required = container_util.get_components_from_key( rd_key )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ container_util.get_components_from_key( rd_key )
except:
tool_shed, name, owner, changeset_revision = container_util.get_components_from_key( rd_val )
installed_repository = suc.get_tool_shed_repository_by_shed_name_owner_changeset_revision( trans.app, tool_shed, name, owner, changeset_revision )
@@ -109,7 +110,7 @@
installed_repositories.append( installed_repository )
for rd_val in rd_vals:
try:
- tool_shed, name, owner, changeset_revision, prior_installation_required = rd_val
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = rd_val
except:
tool_shed, name, owner, changeset_revision = rd_val
installed_repository = suc.get_tool_shed_repository_by_shed_name_owner_changeset_revision( trans.app, tool_shed, name, owner, changeset_revision )
@@ -627,7 +628,7 @@
folder_id += 1
# Generate the label by retrieving the repository name.
try:
- toolshed, name, owner, changeset_revision, prior_installation_required = \
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
container_util.get_components_from_key( old_container_repository_dependencies_folder.key )
except ValueError:
# For backward compatibility to the 12/20/12 Galaxy release.
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/common_install_util.py
--- a/lib/tool_shed/util/common_install_util.py
+++ b/lib/tool_shed/util/common_install_util.py
@@ -165,11 +165,20 @@
missing_rd_tups = []
for tsr in repository.repository_dependencies:
prior_installation_required = suc.set_prior_installation_required( repository, tsr )
- rd_tup = [ tsr.tool_shed, tsr.name, tsr.owner, tsr.changeset_revision, prior_installation_required, tsr.id, tsr.status ]
+ only_if_compiling_contained_td = suc.set_only_if_compiling_contained_td( repository, tsr )
+ rd_tup = [ tsr.tool_shed, tsr.name, tsr.owner, tsr.changeset_revision, prior_installation_required, only_if_compiling_contained_td, tsr.id, tsr.status ]
if tsr.status == trans.model.ToolShedRepository.installation_status.INSTALLED:
installed_rd_tups.append( rd_tup )
else:
- missing_rd_tups.append( rd_tup )
+ # We'll only add the rd_tup to the missing_rd_tups list if the received repository has tool dependencies that are not
+ # correctly installed. This may prove to be a weak check since the repository in question may not have anything to do
+ # with compiling the missing tool dependencies. If we discover that this is a problem, more granular checking will be
+ # necessary here.
+ if repository.missing_tool_dependencies:
+ if not repository_dependency_needed_only_for_compiling_tool_dependency( repository, tsr ):
+ missing_rd_tups.append( rd_tup )
+ else:
+ missing_rd_tups.append( rd_tup )
if installed_rd_tups or missing_rd_tups:
# Get the description from the metadata in case it has a value.
repository_dependencies = metadata.get( 'repository_dependencies', {} )
@@ -180,7 +189,8 @@
repository.name,
repository.owner,
repository.installed_changeset_revision,
- prior_installation_required )
+ prior_installation_required,
+ only_if_compiling_contained_td )
if installed_rd_tups:
installed_repository_dependencies[ 'root_key' ] = root_key
installed_repository_dependencies[ root_key ] = installed_rd_tups
@@ -211,7 +221,8 @@
if key in [ 'description', 'root_key' ]:
continue
for rd_tup in rd_tups:
- tool_shed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( rd_tup )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
# Updates to installed repository revisions may have occurred, so make sure to locate the appropriate repository revision if one exists.
# We need to create a temporary repo_info_tuple that includes the correct repository owner which we get from the current rd_tup. The current
# tuple looks like: ( description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, installed_td )
@@ -219,17 +230,39 @@
tmp_repo_info_tuple = ( None, tmp_clone_url, changeset_revision, None, owner, None, None )
repository, current_changeset_revision = suc.repository_was_previously_installed( trans, tool_shed, name, tmp_repo_info_tuple )
if repository:
- new_rd_tup = [ tool_shed, name, owner, changeset_revision, str( prior_installation_required ), repository.id, repository.status ]
+ new_rd_tup = [ tool_shed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td,
+ repository.id,
+ repository.status ]
if repository.status == trans.model.ToolShedRepository.installation_status.INSTALLED:
if new_rd_tup not in installed_rd_tups:
installed_rd_tups.append( new_rd_tup )
else:
+ # A repository dependency that is not installed will not be considered missing if its value for only_if_compiling_contained_td is
+ # True. This is because this type of repository dependency will only be considered at the time that the specified tool dependency
+ # is being installed, and even then only if the compiled binary of the tool dependency could not be installed due to the unsupported
+ # installation environment.
+ if not util.asbool( only_if_compiling_contained_td ):
+ if new_rd_tup not in missing_rd_tups:
+ missing_rd_tups.append( new_rd_tup )
+ else:
+ new_rd_tup = [ tool_shed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td,
+ None,
+ 'Never installed' ]
+ if not util.asbool( only_if_compiling_contained_td ):
+ # A repository dependency that is not installed will not be considered missing if its value for only_if_compiling_contained_td is
+ # True - see above...
if new_rd_tup not in missing_rd_tups:
missing_rd_tups.append( new_rd_tup )
- else:
- new_rd_tup = [ tool_shed, name, owner, changeset_revision, str( prior_installation_required ), None, 'Never installed' ]
- if new_rd_tup not in missing_rd_tups:
- missing_rd_tups.append( new_rd_tup )
if installed_rd_tups:
installed_repository_dependencies[ 'root_key' ] = root_key
installed_repository_dependencies[ root_key ] = installed_rd_tups
@@ -286,20 +319,32 @@
if key in [ 'root_key', 'description' ]:
continue
try:
- toolshed, name, owner, changeset_revision, prior_installation_required = container_util.get_components_from_key( key )
- components_list = [ toolshed, name, owner, changeset_revision, prior_installation_required ]
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ container_util.get_components_from_key( key )
+ components_list = [ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td ]
except ValueError:
- # For backward compatibility to the 12/20/12 Galaxy release, default prior_installation_required to False in the caller.
+ # For backward compatibility to the 12/20/12 Galaxy release, default prior_installation_required and only_if_compiling_contained_td
+ # to False in the caller.
toolshed, name, owner, changeset_revision = container_util.get_components_from_key( key )
components_list = [ toolshed, name, owner, changeset_revision ]
- if components_list not in required_repository_tups:
- required_repository_tups.append( components_list )
- for components_list in val:
+ only_if_compiling_contained_td = 'False'
+ # Skip listing a repository dependency if it is required only to compile a tool dependency defined for the dependent repository.
+ if not util.asbool( only_if_compiling_contained_td ):
if components_list not in required_repository_tups:
required_repository_tups.append( components_list )
+ for components_list in val:
+ try:
+ only_if_compiling_contained_td = components_list[ 5 ]
+ except:
+ only_if_compiling_contained_td = 'False'
+ # TODO: Fix this to display the tool dependency if only_if_compiling_contained_td is True, but clarify that installation may not
+ # happen.
+ if not util.asbool( only_if_compiling_contained_td ):
+ if components_list not in required_repository_tups:
+ required_repository_tups.append( components_list )
else:
# We have a single repository with no dependencies.
- components_list = [ tool_shed_url, repository_name, repository_owner, changeset_revision, 'False' ]
+ components_list = [ tool_shed_url, repository_name, repository_owner, changeset_revision, 'False', 'False' ]
required_repository_tups.append( components_list )
if required_repository_tups:
# The value of required_repository_tups is a list of tuples, so we need to encode it.
@@ -393,3 +438,16 @@
app.model.ToolDependency.installation_status.ERROR ]:
installed_tool_dependencies.append( tool_dependency )
return installed_tool_dependencies
+
+def repository_dependency_needed_only_for_compiling_tool_dependency( repository, repository_dependency ):
+ for rd_tup in repository.tuples_of_repository_dependencies_needed_for_compiling_td:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = rd_tup
+ # TODO: we may discover that we need to check more than just installed_changeset_revision and changeset_revision here, in which
+ # case we'll need to contact the tool shed to get the list of all possible changeset_revisions.
+ if repository_dependency.tool_shed == tool_shed and \
+ repository_dependency.name == name and \
+ repository_dependency.owner == owner and \
+ ( repository_dependency.installed_changeset_revision == changeset_revision or \
+ repository_dependency.changeset_revision == changeset_revision ):
+ return True
+ return False
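The new helper above matches a candidate dependency against the repository's compile-only tuples by tool shed, name, owner, and either of two changeset revision attributes. The same matching rule can be restated standalone, with a hypothetical lightweight record in place of Galaxy's model object:

```python
from collections import namedtuple

# Hypothetical stand-in for Galaxy's installed-repository model object.
Dep = namedtuple(
    'Dep', 'tool_shed name owner installed_changeset_revision changeset_revision')

def needed_only_for_compiling(compile_only_tups, dep):
    """Mirror the helper's rule: a dependency is compile-only if any recorded
    6-tuple matches on tool shed, name, owner, and either revision attribute."""
    for tool_shed, name, owner, changeset_revision, _prior, _only_if in compile_only_tups:
        if (dep.tool_shed == tool_shed and dep.name == name and dep.owner == owner
                and changeset_revision in (dep.installed_changeset_revision,
                                           dep.changeset_revision)):
            return True
    return False

tups = [('http://localhost:9009', 'package_gcc_4_8', 'test', 'abc123', 'False', 'True')]
dep = Dep('http://localhost:9009', 'package_gcc_4_8', 'test', 'abc123', 'def456')
print(needed_only_for_compiling(tups, dep))  # True
```

As the TODO in the helper notes, matching on just these two revision attributes may be too coarse once a revision has been superseded by updates.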
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/common_util.py
--- a/lib/tool_shed/util/common_util.py
+++ b/lib/tool_shed/util/common_util.py
@@ -46,7 +46,8 @@
if rd_key in [ 'root_key', 'description' ]:
continue
for rd_tup in rd_tups:
- tool_shed, name, owner, changeset_revision, prior_installation_required = parse_repository_dependency_tuple( rd_tup )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td \
+ = parse_repository_dependency_tuple( rd_tup )
tool_shed_accessible, tool_dependencies = get_tool_dependencies( app,
tool_shed,
name,
@@ -155,26 +156,26 @@
return None
def parse_repository_dependency_tuple( repository_dependency_tuple, contains_error=False ):
+ # Default both prior_installation_required and only_if_compiling_contained_td to False in cases where metadata should be reset on the
+ # repository containing the repository_dependency definition.
+ prior_installation_required = 'False'
+ only_if_compiling_contained_td = 'False'
if contains_error:
if len( repository_dependency_tuple ) == 5:
- # Metadata should have been reset on the repository containing this repository_dependency definition.
tool_shed, name, owner, changeset_revision, error = repository_dependency_tuple
- # Default prior_installation_required to False.
- prior_installation_required = False
elif len( repository_dependency_tuple ) == 6:
toolshed, name, owner, changeset_revision, prior_installation_required, error = repository_dependency_tuple
- prior_installation_required = str( prior_installation_required )
- return toolshed, name, owner, changeset_revision, prior_installation_required, error
+ elif len( repository_dependency_tuple ) == 7:
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, error = repository_dependency_tuple
+ return toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, error
else:
if len( repository_dependency_tuple ) == 4:
- # Metadata should have been reset on the repository containing this repository_dependency definition.
tool_shed, name, owner, changeset_revision = repository_dependency_tuple
- # Default prior_installation_required to False.
- prior_installation_required = False
elif len( repository_dependency_tuple ) == 5:
tool_shed, name, owner, changeset_revision, prior_installation_required = repository_dependency_tuple
- prior_installation_required = str( prior_installation_required )
- return tool_shed, name, owner, changeset_revision, prior_installation_required
+ elif len( repository_dependency_tuple ) == 6:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = repository_dependency_tuple
+ return tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td
def tool_shed_get( app, tool_shed_url, uri ):
"""Make contact with the tool shed via the uri provided."""
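After this change, parse_repository_dependency_tuple accepts 4-, 5-, or 6-element tuples and always yields six fields, defaulting the two flags to 'False' for legacy metadata. A hedged restatement of that normalization, outside Galaxy's codebase (error-slot handling omitted):

```python
def parse_rd_tuple(tup):
    """Normalize a repository dependency tuple to six string fields.

    Shorter tuples come from metadata written before the flags existed, so
    prior_installation_required and only_if_compiling_contained_td default
    to 'False' (sketch of the behavior, not Galaxy's implementation).
    """
    prior_installation_required = 'False'
    only_if_compiling_contained_td = 'False'
    if len(tup) == 4:
        tool_shed, name, owner, changeset_revision = tup
    elif len(tup) == 5:
        tool_shed, name, owner, changeset_revision, prior_installation_required = tup
    elif len(tup) == 6:
        (tool_shed, name, owner, changeset_revision,
         prior_installation_required, only_if_compiling_contained_td) = tup
    else:
        raise ValueError('unexpected tuple length: %d' % len(tup))
    return (tool_shed, name, owner, changeset_revision,
            str(prior_installation_required), str(only_if_compiling_contained_td))

print(parse_rd_tuple(('ts', 'name', 'owner', 'rev')))
```

Callers then unpack six values unconditionally, which is what lets the downstream call sites in this changeset drop their per-length branching.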
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/container_util.py
--- a/lib/tool_shed/util/container_util.py
+++ b/lib/tool_shed/util/container_util.py
@@ -64,13 +64,15 @@
self.repository_dependencies.remove( contained_repository_dependency )
def to_repository_dependency( self, repository_dependency_id ):
- toolshed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( self.key.split( STRSEP ) )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( self.key.split( STRSEP ) )
return RepositoryDependency( id=repository_dependency_id,
toolshed=toolshed,
repository_name=name,
repository_owner=owner,
changeset_revision=changeset_revision,
- prior_installation_required=asbool( prior_installation_required ) )
+ prior_installation_required=asbool( prior_installation_required ),
+ only_if_compiling_contained_td=asbool( only_if_compiling_contained_td ) )
class DataManager( object ):
@@ -120,13 +122,15 @@
class InvalidRepositoryDependency( object ):
"""Invalid repository dependency definition object"""
- def __init__( self, id=None, toolshed=None, repository_name=None, repository_owner=None, changeset_revision=None, prior_installation_required=False, error=None ):
+ def __init__( self, id=None, toolshed=None, repository_name=None, repository_owner=None, changeset_revision=None, prior_installation_required=False,
+ only_if_compiling_contained_td=False, error=None ):
self.id = id
self.toolshed = toolshed
self.repository_name = repository_name
self.repository_owner = repository_owner
self.changeset_revision = changeset_revision
self.prior_installation_required = prior_installation_required
+ self.only_if_compiling_contained_td = only_if_compiling_contained_td
self.error = error
@@ -194,19 +198,25 @@
"""Repository dependency object"""
def __init__( self, id=None, toolshed=None, repository_name=None, repository_owner=None, changeset_revision=None, prior_installation_required=False,
- installation_status=None, tool_shed_repository_id=None ):
+ only_if_compiling_contained_td=False, installation_status=None, tool_shed_repository_id=None ):
self.id = id
self.toolshed = toolshed
self.repository_name = repository_name
self.repository_owner = repository_owner
self.changeset_revision = changeset_revision
self.prior_installation_required = prior_installation_required
+ self.only_if_compiling_contained_td = only_if_compiling_contained_td
self.installation_status = installation_status
self.tool_shed_repository_id = tool_shed_repository_id
@property
def listify( self ):
- return [ self.toolshed, self.repository_name, self.repository_owner, self.changeset_revision, asbool( str( self.prior_installation_required ) ) ]
+ return [ self.toolshed,
+ self.repository_name,
+ self.repository_owner,
+ self.changeset_revision,
+ self.prior_installation_required,
+ self.only_if_compiling_contained_td ]
class RepositoryInstallationError( object ):
@@ -463,9 +473,14 @@
for invalid_repository_dependency in invalid_repository_dependencies:
folder_id += 1
invalid_repository_dependency_id += 1
- toolshed, name, owner, changeset_revision, prior_installation_required, error = \
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, error = \
common_util.parse_repository_dependency_tuple( invalid_repository_dependency, contains_error=True )
- key = generate_repository_dependencies_key_for_repository( toolshed, name, owner, changeset_revision, prior_installation_required )
+ key = generate_repository_dependencies_key_for_repository( toolshed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td )
label = "Repository <b>%s</b> revision <b>%s</b> owned by <b>%s</b>" % ( name, changeset_revision, owner )
folder = Folder( id=folder_id,
key=key,
@@ -477,6 +492,7 @@
repository_owner=owner,
changeset_revision=changeset_revision,
prior_installation_required=asbool( prior_installation_required ),
+ only_if_compiling_contained_td=asbool( only_if_compiling_contained_td ),
error=error )
folder.invalid_repository_dependencies.append( ird )
invalid_repository_dependencies_folder.folders.append( folder )
@@ -1230,12 +1246,13 @@
return cast_empty_repository_dependency_folders( sub_folder, repository_dependency_id )
return folder, repository_dependency_id
-def generate_repository_dependencies_folder_label_from_key( repository_name, repository_owner, changeset_revision, prior_installation_required, key ):
+def generate_repository_dependencies_folder_label_from_key( repository_name, repository_owner, changeset_revision, prior_installation_required,
+ only_if_compiling_contained_td, key ):
"""Return a repository dependency label based on the repository dependency key."""
- if key_is_current_repositorys_key( repository_name, repository_owner, changeset_revision, prior_installation_required, key ):
+ if key_is_current_repositorys_key( repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, key ):
label = 'Repository dependencies'
else:
- if prior_installation_required:
+ if asbool( prior_installation_required ):
prior_installation_required_str = " <i>(prior install required)</i>"
else:
prior_installation_required_str = ""
@@ -1243,9 +1260,10 @@
( repository_name, changeset_revision, repository_owner, prior_installation_required_str )
return label
-def generate_repository_dependencies_key_for_repository( toolshed_base_url, repository_name, repository_owner, changeset_revision, prior_installation_required ):
- # FIXME: assumes tool shed is current tool shed since repository dependencies across tool sheds is not yet supported.
- return '%s%s%s%s%s%s%s%s%s' % ( str( toolshed_base_url ).rstrip( '/' ),
+def generate_repository_dependencies_key_for_repository( toolshed_base_url, repository_name, repository_owner, changeset_revision, prior_installation_required,
+ only_if_compiling_contained_td ):
+ """Assumes tool shed is current tool shed since repository dependencies across tool sheds is not yet supported."""
+ return '%s%s%s%s%s%s%s%s%s%s%s' % ( str( toolshed_base_url ).rstrip( '/' ),
STRSEP,
str( repository_name ),
STRSEP,
@@ -1253,7 +1271,9 @@
STRSEP,
str( changeset_revision ),
STRSEP,
- str( prior_installation_required ) )
+ str( prior_installation_required ),
+ STRSEP,
+ str( only_if_compiling_contained_td ) )
def generate_tool_dependencies_key( name, version, type ):
return '%s%s%s%s%s' % ( str( name ), STRSEP, str( version ), STRSEP, str( type ) )
@@ -1266,22 +1286,27 @@
return None
def get_components_from_repository_dependency_for_installed_repository( trans, repository_dependency ):
- """
- Parse a repository dependency and return components necessary for proper display in Galaxy on the Manage repository page.
- """
+ """Parse a repository dependency and return components necessary for proper display in Galaxy on the Manage repository page."""
+ # Default prior_installation_required and only_if_compiling_contained_td to False.
+ prior_installation_required = 'False'
+ only_if_compiling_contained_td = 'False'
if len( repository_dependency ) == 6:
# Metadata should have been reset on this installed repository, but it wasn't.
tool_shed_repository_id = repository_dependency[ 4 ]
installation_status = repository_dependency[ 5 ]
tool_shed, name, owner, changeset_revision = repository_dependency[ 0:4 ]
- # Default prior_installation_required to False.
- prior_installation_required = False
- repository_dependency = [ tool_shed, name, owner, changeset_revision, prior_installation_required ]
+ repository_dependency = [ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td ]
elif len( repository_dependency ) == 7:
- # We have a repository dependency tuple that includes a prior_installation_required value.
+ # We have a repository dependency tuple that includes a prior_installation_required value but not an only_if_compiling_contained_td value.
tool_shed_repository_id = repository_dependency[ 5 ]
installation_status = repository_dependency[ 6 ]
- repository_dependency = repository_dependency[ 0:5 ]
+ tool_shed, name, owner, changeset_revision, prior_installation_required = repository_dependency[ 0:5 ]
+ repository_dependency = [ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td ]
+ elif len( repository_dependency ) == 8:
+ # We have a repository dependency tuple that includes both a prior_installation_required value and an only_if_compiling_contained_td value.
+ tool_shed_repository_id = repository_dependency[ 6 ]
+ installation_status = repository_dependency[ 7 ]
+ repository_dependency = repository_dependency[ 0:6 ]
else:
tool_shed_repository_id = None
installation_status = 'unknown'
@@ -1295,31 +1320,41 @@
return tool_shed_repository_id, installation_status, repository_dependency
def get_components_from_key( key ):
- # FIXME: assumes tool shed is current tool shed since repository dependencies across tool sheds is not yet supported.
+ """Assumes tool shed is current tool shed since repository dependencies across tool sheds is not yet supported."""
items = key.split( STRSEP )
toolshed_base_url = items[ 0 ]
repository_name = items[ 1 ]
repository_owner = items[ 2 ]
changeset_revision = items[ 3 ]
- if len( items ) == 5:
- prior_installation_required = asbool( str( items[ 4 ] ) )
- return toolshed_base_url, repository_name, repository_owner, changeset_revision, prior_installation_required
+ if len( items ) >= 5:
+ prior_installation_required = items[ 4 ]
+ try:
+ only_if_compiling_contained_td = items[ 5 ]
+ except IndexError:
+ only_if_compiling_contained_td = 'False'
+ return toolshed_base_url, repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td
else:
# For backward compatibility to the 12/20/12 Galaxy release we have to return the following, and callers must handle exceptions.
return toolshed_base_url, repository_name, repository_owner, changeset_revision
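A runnable sketch of the key parsing above ('__ESEP__' is a stand-in value for container_util.STRSEP, which is an assumption here):

```python
STRSEP = '__ESEP__'  # stand-in for container_util.STRSEP

def get_components_from_key( key ):
    items = key.split( STRSEP )
    toolshed_base_url, repository_name, repository_owner, changeset_revision = items[ 0:4 ]
    if len( items ) >= 5:
        prior_installation_required = items[ 4 ]
        # Keys written before this change carry only five components, so
        # default only_if_compiling_contained_td to the string 'False'.
        only_if_compiling_contained_td = items[ 5 ] if len( items ) >= 6 else 'False'
        return toolshed_base_url, repository_name, repository_owner, changeset_revision, \
            prior_installation_required, only_if_compiling_contained_td
    # Pre-12/20/12 keys carried only four components; callers handle the
    # ValueError raised when they try to unpack six values from this shorter tuple.
    return toolshed_base_url, repository_name, repository_owner, changeset_revision
```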
def handle_repository_dependencies_container_entry( trans, repository_dependencies_folder, rd_key, rd_value, folder_id, repository_dependency_id, folder_keys ):
try:
- toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required = get_components_from_key( rd_key )
+ toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ get_components_from_key( rd_key )
except ValueError:
- # For backward compatibility to the 12/20/12 Galaxy release, default prior_installation_required to False.
+ # For backward compatibility to the 12/20/12 Galaxy release, default prior_installation_required and only_if_compiling_contained_td to 'False'.
toolshed, repository_name, repository_owner, changeset_revision = get_components_from_key( rd_key )
- prior_installation_required = False
+ prior_installation_required = 'False'
+ only_if_compiling_contained_td = 'False'
folder = get_folder( repository_dependencies_folder, rd_key )
label = generate_repository_dependencies_folder_label_from_key( repository_name,
repository_owner,
changeset_revision,
prior_installation_required,
+ only_if_compiling_contained_td,
repository_dependencies_folder.key )
if folder:
if rd_key not in folder_keys:
@@ -1351,7 +1386,7 @@
installation_status = None
can_create_dependency = not is_subfolder_of( sub_folder, repository_dependency )
if can_create_dependency:
- toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required = \
+ toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( repository_dependency )
repository_dependency_id += 1
repository_dependency = RepositoryDependency( id=repository_dependency_id,
@@ -1360,6 +1395,7 @@
repository_owner=repository_owner,
changeset_revision=changeset_revision,
prior_installation_required=asbool( prior_installation_required ),
+ only_if_compiling_contained_td=asbool( only_if_compiling_contained_td ),
installation_status=installation_status,
tool_shed_repository_id=tool_shed_repository_id )
# Insert the repository_dependency into the folder.
@@ -1367,25 +1403,35 @@
return repository_dependencies_folder, folder_id, repository_dependency_id
def is_subfolder_of( folder, repository_dependency ):
- toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required = \
+ toolshed, repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( repository_dependency )
- key = generate_repository_dependencies_key_for_repository( toolshed, repository_name, repository_owner, changeset_revision, asbool( prior_installation_required ) )
+ key = generate_repository_dependencies_key_for_repository( toolshed,
+ repository_name,
+ repository_owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td )
for sub_folder in folder.folders:
if key == sub_folder.key:
return True
return False
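is_subfolder_of hinges on regenerating the dependency's key and comparing it against each subfolder key; a minimal sketch (separator value and function name are assumptions):

```python
STRSEP = '__ESEP__'  # stand-in for container_util.STRSEP

def generate_repository_dependencies_key( toolshed, name, owner, changeset_revision,
                                          prior_installation_required, only_if_compiling_contained_td ):
    # All six components are joined as strings so that keys generated from old
    # and new metadata compare consistently.
    return STRSEP.join( [ str( toolshed ), str( name ), str( owner ), str( changeset_revision ),
                          str( prior_installation_required ), str( only_if_compiling_contained_td ) ] )

def is_subfolder_of( sub_folder_keys, repository_dependency ):
    key = generate_repository_dependencies_key( *repository_dependency )
    return key in sub_folder_keys
```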
-def key_is_current_repositorys_key( repository_name, repository_owner, changeset_revision, prior_installation_required, key ):
+def key_is_current_repositorys_key( repository_name, repository_owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, key ):
try:
- toolshed_base_url, key_name, key_owner, key_changeset_revision, key_prior_installation_required = get_components_from_key( key )
+ toolshed_base_url, key_name, key_owner, key_changeset_revision, key_prior_installation_required, key_only_if_compiling_contained_td = \
+ get_components_from_key( key )
except ValueError:
# For backward compatibility to the 12/20/12 Galaxy release, default key_prior_installation_required and key_only_if_compiling_contained_td to 'False'.
toolshed_base_url, key_name, key_owner, key_changeset_revision = get_components_from_key( key )
- key_prior_installation_required = False
- return repository_name == key_name and \
+ key_prior_installation_required = 'False'
+ key_only_if_compiling_contained_td = 'False'
+ if repository_name == key_name and \
repository_owner == key_owner and \
changeset_revision == key_changeset_revision and \
- prior_installation_required == key_prior_installation_required
+ prior_installation_required == key_prior_installation_required and \
+ only_if_compiling_contained_td == key_only_if_compiling_contained_td:
+ return True
+ return False
def populate_repository_dependencies_container( trans, repository_dependencies_folder, repository_dependencies, folder_id, repository_dependency_id ):
folder_keys = repository_dependencies.keys()
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/metadata_util.py
--- a/lib/tool_shed/util/metadata_util.py
+++ b/lib/tool_shed/util/metadata_util.py
@@ -199,19 +199,23 @@
def compare_repository_dependencies( trans, ancestor_repository_dependencies, current_repository_dependencies ):
"""Determine if ancestor_repository_dependencies is the same as or a subset of current_repository_dependencies."""
- # The list of repository_dependencies looks something like: [["http://localhost:9009", "emboss_datatypes", "test", "ab03a2a5f407", False]].
+ # The list of repository_dependencies looks something like:
+ # [["http://localhost:9009", "emboss_datatypes", "test", "ab03a2a5f407", "False", "False"]].
# Create a string from each tuple in the list for easier comparison.
if len( ancestor_repository_dependencies ) <= len( current_repository_dependencies ):
for ancestor_tup in ancestor_repository_dependencies:
- ancestor_tool_shed, ancestor_repository_name, ancestor_repository_owner, ancestor_changeset_revision, ancestor_prior_installation_required = ancestor_tup
+ a_tool_shed, a_repo_name, a_repo_owner, a_changeset_revision, a_prior_installation_required, a_only_if_compiling_contained_td = \
+ ancestor_tup
found_in_current = False
for current_tup in current_repository_dependencies:
- current_tool_shed, current_repository_name, current_repository_owner, current_changeset_revision, current_prior_installation_required = current_tup
- if current_tool_shed == ancestor_tool_shed and \
- current_repository_name == ancestor_repository_name and \
- current_repository_owner == ancestor_repository_owner and \
- current_changeset_revision == ancestor_changeset_revision and \
- util.string_as_bool( current_prior_installation_required ) == util.string_as_bool( ancestor_prior_installation_required ):
+ c_tool_shed, c_repo_name, c_repo_owner, c_changeset_revision, c_prior_installation_required, c_only_if_compiling_contained_td = \
+ current_tup
+ if c_tool_shed == a_tool_shed and \
+ c_repo_name == a_repo_name and \
+ c_repo_owner == a_repo_owner and \
+ c_changeset_revision == a_changeset_revision and \
+ util.string_as_bool( c_prior_installation_required ) == util.string_as_bool( a_prior_installation_required ) and \
+ util.string_as_bool( c_only_if_compiling_contained_td ) == util.string_as_bool( a_only_if_compiling_contained_td ):
found_in_current = True
break
if not found_in_current:
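The tuple comparison above normalizes the two boolean-valued strings with util.string_as_bool, so spellings such as 'False' and 'no' compare equal; a minimal stand-in:

```python
def string_as_bool( value ):
    # Stand-in for galaxy.util.string_as_bool; the accepted truthy
    # spellings here are an assumption.
    return str( value ).lower() in ( 'true', 'yes', 'on' )

ancestor_tup = ( 'http://localhost:9009', 'emboss_datatypes', 'test', 'ab03a2a5f407', 'False', 'False' )
current_tup = ( 'http://localhost:9009', 'emboss_datatypes', 'test', 'ab03a2a5f407', 'false', 'no' )
# The first four components compare literally; the last two compare as booleans.
found_in_current = ancestor_tup[ 0:4 ] == current_tup[ 0:4 ] and \
    string_as_bool( ancestor_tup[ 4 ] ) == string_as_bool( current_tup[ 4 ] ) and \
    string_as_bool( ancestor_tup[ 5 ] ) == string_as_bool( current_tup[ 5 ] )
```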
@@ -354,9 +358,11 @@
Determine if the only difference between rd_tup and a dependency definition in the list of repository_dependencies is the changeset_revision value.
"""
new_metadata_required = False
- rd_tool_shed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required = common_util.parse_repository_dependency_tuple( rd_tup )
+ rd_tool_shed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
for repository_dependency in repository_dependencies:
- tool_shed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( repository_dependency )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repository_dependency )
if rd_tool_shed == tool_shed and rd_name == name and rd_owner == owner:
# Determine if the repository represented by the dependency tuple is an instance of the repository type TipOnly.
required_repository = suc.get_repository_by_name_and_owner( trans.app, name, owner )
@@ -750,7 +756,9 @@
# We have a complex repository dependency. If the returned value of repository_dependency_is_valid is True, the tool
# dependency definition will be set as invalid. This is currently the only case where a tool dependency definition is
# considered invalid.
- repository_dependency_tup, repository_dependency_is_valid, error_message = handle_repository_elem( app=app, repository_elem=sub_elem )
+ repository_dependency_tup, repository_dependency_is_valid, error_message = handle_repository_elem( app=app,
+ repository_elem=sub_elem,
+ only_if_compiling_contained_td=False )
elif sub_elem.tag == 'install':
package_install_version = sub_elem.get( 'version', '1.0' )
if package_install_version == '1.0':
@@ -763,7 +771,8 @@
in_actions_group, actions_elem = actions_elem_tuple
if in_actions_group:
# Since we're inside an <actions_group> tag set, inspect the actions_elem to see if a complex repository dependency
- # is defined.
+ # is defined. By definition, complex repository dependency definitions contained within the last <actions> tag set
+ # within an <actions_group> tag set will have the value of "only_if_compiling_contained_td" set to True in the generated metadata.
for action_elem in actions_elem:
if action_elem.tag == 'package':
# <package name="libgtextutils" version="0.6">
@@ -776,7 +785,9 @@
if sub_action_elem.tag == 'repository':
# We have a complex repository dependency.
repository_dependency_tup, repository_dependency_is_valid, error_message = \
- handle_repository_elem( app=app, repository_elem=sub_action_elem )
+ handle_repository_elem( app=app,
+ repository_elem=sub_action_elem,
+ only_if_compiling_contained_td=True )
if requirements_dict:
dependency_key = '%s/%s' % ( package_name, package_version )
if repository_dependency_is_valid:
@@ -806,13 +817,21 @@
valid_repository_dependencies_dict = dict( description=root.get( 'description' ) )
valid_repository_dependency_tups = []
for repository_elem in root.findall( 'repository' ):
- repository_dependency_tup, repository_dependency_is_valid, error_message = handle_repository_elem( app, repository_elem )
+ repository_dependency_tup, repository_dependency_is_valid, error_message = handle_repository_elem( app,
+ repository_elem,
+ only_if_compiling_contained_td=False )
if repository_dependency_is_valid:
valid_repository_dependency_tups.append( repository_dependency_tup )
else:
# Append the error_message to the repository dependencies tuple.
- toolshed, name, owner, changeset_revision, prior_installation_required = repository_dependency_tup
- repository_dependency_tup = ( toolshed, name, owner, changeset_revision, str( prior_installation_required ), error_message )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = repository_dependency_tup
+ repository_dependency_tup = ( toolshed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td,
+ error_message )
invalid_repository_dependency_tups.append( repository_dependency_tup )
if invalid_repository_dependency_tups:
invalid_repository_dependencies_dict[ 'repository_dependencies' ] = invalid_repository_dependency_tups
@@ -859,8 +878,9 @@
# We have an invalid complex repository dependency, so mark the tool dependency as invalid.
tool_dependency_is_valid = False
# Append the error message to the invalid repository dependency tuple.
- toolshed, name, owner, changeset_revision, prior_installation_required = repository_dependency_tup
- repository_dependency_tup = ( toolshed, name, owner, changeset_revision, str( prior_installation_required ), message )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = repository_dependency_tup
+ repository_dependency_tup = \
+ ( toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, message )
invalid_repository_dependency_tups.append( repository_dependency_tup )
error_message = '%s %s' % ( error_message, message )
elif elem.tag == 'set_environment':
@@ -1137,7 +1157,7 @@
deleted_tool_dependency_names.append( original_dependency_val_dict[ 'name' ] )
return updated_tool_dependency_names, deleted_tool_dependency_names
-def handle_repository_elem( app, repository_elem ):
+def handle_repository_elem( app, repository_elem, only_if_compiling_contained_td=False ):
"""
Process the received repository_elem which is a <repository> tag either from a repository_dependencies.xml file or a tool_dependencies.xml file.
If the former, we're generating repository dependencies metadata for a repository in the tool shed. If the latter, we're generating package
@@ -1155,7 +1175,7 @@
owner = repository_elem.get( 'owner' )
changeset_revision = repository_elem.get( 'changeset_revision' )
prior_installation_required = str( repository_elem.get( 'prior_installation_required', False ) )
- repository_dependency_tup = [ toolshed, name, owner, changeset_revision, str( prior_installation_required ) ]
+ repository_dependency_tup = [ toolshed, name, owner, changeset_revision, prior_installation_required, str( only_if_compiling_contained_td ) ]
user = None
repository = None
if app.name == 'galaxy':
@@ -1974,9 +1994,9 @@
repository_dependencies_dict = metadata.get( 'invalid_repository_dependencies', None )
for repository_dependency_tup in repository_dependency_tups:
if is_valid:
- tool_shed, name, owner, changeset_revision, prior_installation_required = repository_dependency_tup
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = repository_dependency_tup
else:
- tool_shed, name, owner, changeset_revision, prior_installation_required, error_message = repository_dependency_tup
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, error_message = repository_dependency_tup
if repository_dependencies_dict:
repository_dependencies = repository_dependencies_dict.get( 'repository_dependencies', [] )
for repository_dependency_tup in repository_dependency_tups:
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/repository_dependency_util.py
--- a/lib/tool_shed/util/repository_dependency_util.py
+++ b/lib/tool_shed/util/repository_dependency_util.py
@@ -33,30 +33,30 @@
for key, val in repository_dependencies.items():
if key in [ 'root_key', 'description' ]:
continue
- dependent_repository = None
+ d_repository = None
try:
- dependent_toolshed, dependent_name, dependent_owner, dependent_changeset_revision, dependent_prior_installation_required = \
+ d_toolshed, d_name, d_owner, d_changeset_revision, d_prior_installation_required, d_only_if_compiling_contained_td = \
container_util.get_components_from_key( key )
except ValueError:
# For backward compatibility to the 12/20/12 Galaxy release.
- dependent_toolshed, dependent_name, dependent_owner, dependent_changeset_revision = container_util.get_components_from_key( key )
+ d_toolshed, d_name, d_owner, d_changeset_revision = container_util.get_components_from_key( key )
for tsr in tool_shed_repositories:
# Get the tool_shed_repository defined by name, owner and changeset_revision. This is the repository that will be
- # dependent upon each of the tool shed repositories contained in val.
- # TODO: Check tool_shed_repository.tool_shed as well when repository dependencies across tool sheds is supported.
- if tsr.name == dependent_name and tsr.owner == dependent_owner and tsr.changeset_revision == dependent_changeset_revision:
- dependent_repository = tsr
+ # dependent upon each of the tool shed repositories contained in val. We'll need to check tool_shed_repository.tool_shed
+ # as well if/when repository dependencies across tool sheds is supported.
+ if tsr.name == d_name and tsr.owner == d_owner and tsr.changeset_revision == d_changeset_revision:
+ d_repository = tsr
break
- if dependent_repository is None:
+ if d_repository is None:
# The dependent repository is not in the received list so look in the database.
- dependent_repository = suc.get_or_create_tool_shed_repository( trans, dependent_toolshed, dependent_name, dependent_owner, dependent_changeset_revision )
+ d_repository = suc.get_or_create_tool_shed_repository( trans, d_toolshed, d_name, d_owner, d_changeset_revision )
# Process each repository_dependency defined for the current dependent repository.
for repository_dependency_components_list in val:
required_repository = None
- rd_toolshed, rd_name, rd_owner, rd_changeset_revision, prior_installation_required = \
+ rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( repository_dependency_components_list )
# Get the tool_shed_repository defined by rd_name, rd_owner and rd_changeset_revision. This is the repository that will be
- # required by the current dependent_repository.
+ # required by the current d_repository.
# TODO: Check tool_shed_repository.tool_shed as well when repository dependencies across tool sheds is supported.
for tsr in tool_shed_repositories:
if tsr.name == rd_name and tsr.owner == rd_owner and tsr.changeset_revision == rd_changeset_revision:
@@ -65,9 +65,9 @@
if required_repository is None:
# The required repository is not in the received list so look in the database.
required_repository = suc.get_or_create_tool_shed_repository( trans, rd_toolshed, rd_name, rd_owner, rd_changeset_revision )
- # Ensure there is a repository_dependency relationship between dependent_repository and required_repository.
+ # Ensure there is a repository_dependency relationship between d_repository and required_repository.
rrda = None
- for rd in dependent_repository.repository_dependencies:
+ for rd in d_repository.repository_dependencies:
if rd.id == required_repository.id:
rrda = rd
break
@@ -78,8 +78,8 @@
repository_dependency = trans.model.RepositoryDependency( tool_shed_repository_id=required_repository.id )
trans.sa_session.add( repository_dependency )
trans.sa_session.flush()
- # Build the relationship between the dependent_repository and the required_repository.
- rrda = trans.model.RepositoryRepositoryDependencyAssociation( tool_shed_repository_id=dependent_repository.id,
+ # Build the relationship between the d_repository and the required_repository.
+ rrda = trans.model.RepositoryRepositoryDependencyAssociation( tool_shed_repository_id=d_repository.id,
repository_dependency_id=repository_dependency.id )
trans.sa_session.add( rrda )
trans.sa_session.flush()
@@ -257,40 +257,42 @@
if invalid_repository_dependencies_dict:
invalid_repository_dependencies = invalid_repository_dependencies_dict.get( 'invalid_repository_dependencies', [] )
for repository_dependency_tup in invalid_repository_dependencies:
- toolshed, name, owner, changeset_revision, prior_installation_required, error = \
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, error = \
common_util.parse_repository_dependency_tuple( repository_dependency_tup, contains_error=True )
if error:
message = '%s ' % str( error )
return message
def get_key_for_repository_changeset_revision( toolshed_base_url, repository, repository_metadata, all_repository_dependencies ):
- prior_installation_required = get_prior_installation_required_for_key( toolshed_base_url, repository, repository_metadata, all_repository_dependencies )
+ prior_installation_required, only_if_compiling_contained_td = \
+ get_prior_installation_required_and_only_if_compiling_contained_td( toolshed_base_url, repository, repository_metadata, all_repository_dependencies )
# Create a key that includes the values of prior_installation_required and only_if_compiling_contained_td.
key = container_util.generate_repository_dependencies_key_for_repository( toolshed_base_url=toolshed_base_url,
repository_name=repository.name,
repository_owner=repository.user.username,
changeset_revision=repository_metadata.changeset_revision,
- prior_installation_required=prior_installation_required )
+ prior_installation_required=prior_installation_required,
+ only_if_compiling_contained_td=only_if_compiling_contained_td )
return key
-def get_prior_installation_required_for_key( toolshed_base_url, repository, repository_metadata, all_repository_dependencies ):
+def get_prior_installation_required_and_only_if_compiling_contained_td( toolshed_base_url, repository, repository_metadata, all_repository_dependencies ):
"""
If all_repository_dependencies contains a repository dependency tuple that is associated with the received repository, return the
values of the tuple's prior_installation_required and only_if_compiling_contained_td components.
"""
- rd_tuple = ( toolshed_base_url, repository.name, repository.user.username, repository_metadata.changeset_revision )
for rd_key, rd_tups in all_repository_dependencies.items():
if rd_key in [ 'root_key', 'description' ]:
continue
for rd_tup in rd_tups:
- rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required = common_util.parse_repository_dependency_tuple( rd_tup )
+ rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
if rd_toolshed == toolshed_base_url and \
rd_name == repository.name and \
rd_owner == repository.user.username and \
rd_changeset_revision == repository_metadata.changeset_revision:
- return rd_prior_installation_required
- # Default prior_installation_required to False.
- return False
+ return rd_prior_installation_required, rd_only_if_compiling_contained_td
+ # Default both prior_installation_required and only_if_compiling_contained_td to False.
+ return 'False', 'False'
def get_repository_dependencies_for_installed_tool_shed_repository( trans, repository ):
"""
@@ -374,7 +376,7 @@
for key_rd_dict in key_rd_dicts:
key = key_rd_dict.keys()[ 0 ]
repository_dependency = key_rd_dict[ key ]
- rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required = \
+ rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( repository_dependency )
if suc.tool_shed_is_this_tool_shed( rd_toolshed ):
repository = suc.get_repository_by_name_and_owner( trans.app, rd_name, rd_owner )
@@ -397,12 +399,13 @@
changeset_revision )
if repository_metadata:
new_key_rd_dict = {}
- new_key_rd_dict[ key ] = [ rd_toolshed, rd_name, rd_owner, repository_metadata.changeset_revision, str( rd_prior_installation_required ) ]
+ new_key_rd_dict[ key ] = \
+ [ rd_toolshed, rd_name, rd_owner, repository_metadata.changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td ]
+ # We have the updated changeset revision.
updated_key_rd_dicts.append( new_key_rd_dict )
else:
try:
- toolshed, repository_name, repository_owner, repository_changeset_revision, prior_installation_required = \
+ toolshed, repository_name, repository_owner, repository_changeset_revision, prior_installation_required, rd_only_if_compiling_contained_td = \
container_util.get_components_from_key( key )
except ValueError:
# For backward compatibility to the 12/20/12 Galaxy release.
@@ -412,7 +415,7 @@
log.debug( message )
else:
try:
- toolshed, repository_name, repository_owner, repository_changeset_revision, prior_installation_required = \
+ toolshed, repository_name, repository_owner, repository_changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
container_util.get_components_from_key( key )
except ValueError:
# For backward compatibility to the 12/20/12 Galaxy release.
@@ -462,7 +465,8 @@
def handle_key_rd_dicts_for_repository( trans, current_repository_key, repository_key_rd_dicts, key_rd_dicts_to_be_processed, handled_key_rd_dicts, circular_repository_dependencies ):
key_rd_dict = repository_key_rd_dicts.pop( 0 )
repository_dependency = key_rd_dict[ current_repository_key ]
- toolshed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( repository_dependency )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repository_dependency )
if suc.tool_shed_is_this_tool_shed( toolshed ):
required_repository = suc.get_repository_by_name_and_owner( trans.app, name, owner )
required_repository_metadata = metadata_util.get_repository_metadata_by_repository_id_changeset_revision( trans,
@@ -691,11 +695,13 @@
clean_key_rd_dicts = []
key = key_rd_dicts[ 0 ].keys()[ 0 ]
repository_tup = key.split( container_util.STRSEP )
- rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required = common_util.parse_repository_dependency_tuple( repository_tup )
+ rd_toolshed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repository_tup )
for key_rd_dict in key_rd_dicts:
k = key_rd_dict.keys()[ 0 ]
repository_dependency = key_rd_dict[ k ]
- toolshed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( repository_dependency )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repository_dependency )
if rd_toolshed == toolshed and rd_name == name and rd_owner == owner:
log.debug( "Removing repository dependency for repository %s owned by %s since it refers to a revision within itself." % ( name, owner ) )
else:
@@ -705,8 +711,14 @@
return clean_key_rd_dicts
def get_repository_dependency_as_key( repository_dependency ):
- tool_shed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( repository_dependency )
- return container_util.generate_repository_dependencies_key_for_repository( tool_shed, name, owner, changeset_revision, str( prior_installation_required ) )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repository_dependency )
+ return container_util.generate_repository_dependencies_key_for_repository( tool_shed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td )
def get_repository_dependency_by_repository_id( trans, decoded_repository_id ):
return trans.sa_session.query( trans.model.RepositoryDependency ) \
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/shed_util_common.py
--- a/lib/tool_shed/util/shed_util_common.py
+++ b/lib/tool_shed/util/shed_util_common.py
@@ -325,7 +325,8 @@
def generate_clone_url_from_repo_info_tup( repo_info_tup ):
"""Generate teh URL for cloning a repositoyr given a tuple of toolshed, name, owner, changeset_revision."""
# Example tuple: ['http://localhost:9009', 'blast_datatypes', 'test', '461a4216e8ab', False]
- toolshed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( repo_info_tup )
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( repo_info_tup )
# Don't include the changeset_revision in clone urls.
return url_join( toolshed, 'repos', owner, name )
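The clone URL generated above drops the changeset_revision; a sketch with a hypothetical stand-in for url_join:

```python
def url_join( *parts ):
    # Stand-in for shed_util_common.url_join: join path pieces with single slashes.
    return '/'.join( part.strip( '/' ) for part in parts )

def generate_clone_url_from_repo_info_tup( repo_info_tup ):
    # Only the toolshed, owner and name components matter for cloning; the
    # changeset_revision (and any trailing fields) are deliberately omitted.
    toolshed, name, owner = repo_info_tup[ 0 ], repo_info_tup[ 1 ], repo_info_tup[ 2 ]
    return url_join( toolshed, 'repos', owner, name )
```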
@@ -975,16 +976,24 @@
if key in [ 'description', 'root_key' ]:
continue
for rd_tup in rd_tups:
- tool_shed, name, owner, changeset_revision, prior_installation_required = common_util.parse_repository_dependency_tuple( rd_tup )
- if asbool( prior_installation_required ):
- if trans.webapp.name == 'galaxy':
- repository = get_repository_for_dependency_relationship( trans.app, tool_shed, name, owner, changeset_revision )
- else:
- repository = get_repository_by_name_and_owner( trans.app, name, owner )
- if repository:
- encoded_repository_id = trans.security.encode_id( repository.id )
- if encoded_repository_id in tsr_ids:
- prior_tsr_ids.append( encoded_repository_id )
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
+ # If only_if_compiling_contained_td is True, then the repository dependency is not required to be installed prior to the dependent
+ # repository even if prior_installation_required is True. This is because the only meaningful content of the repository dependency
+ # is its contained tool dependency, which is required in order to compile the dependent repository's tool dependency. In the scenario
+ # where the repository dependency is not installed prior to the dependent repository's tool dependency compilation process, the tool
+ # dependency compilation framework will install the repository dependency prior to compilation of the dependent repository's tool
+ # dependency.
+ if not asbool( only_if_compiling_contained_td ):
+ if asbool( prior_installation_required ):
+ if trans.webapp.name == 'galaxy':
+ repository = get_repository_for_dependency_relationship( trans.app, tool_shed, name, owner, changeset_revision )
+ else:
+ repository = get_repository_by_name_and_owner( trans.app, name, owner )
+ if repository:
+ encoded_repository_id = trans.security.encode_id( repository.id )
+ if encoded_repository_id in tsr_ids:
+ prior_tsr_ids.append( encoded_repository_id )
return prior_tsr_ids
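The filtering logic above, skipping compile-only dependencies before honoring prior_installation_required, reduces to the following sketch (the tuple shape with an appended encoded repository id is hypothetical, for illustration):

```python
def asbool( value ):
    # Stand-in for galaxy.util.asbool; accepted truthy spellings are an assumption.
    return str( value ).lower() in ( 'true', 'yes', 'on' )

def select_prior_install_ids( rd_tups, tsr_ids ):
    prior_tsr_ids = []
    for tool_shed, name, owner, changeset_revision, prior_installation_required, \
            only_if_compiling_contained_td, encoded_repository_id in rd_tups:
        # Compile-only dependencies are excluded even when prior installation is
        # requested; the tool dependency compilation framework installs them itself.
        if not asbool( only_if_compiling_contained_td ) and asbool( prior_installation_required ):
            if encoded_repository_id in tsr_ids:
                prior_tsr_ids.append( encoded_repository_id )
    return prior_tsr_ids
```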
def get_repository_in_tool_shed( trans, id ):
@@ -1425,9 +1434,21 @@
"""Return a reversed list of changesets in the repository changelog up to and including the included_upper_bounds_changeset_revision."""
return reversed_lower_upper_bounded_changelog( repo, INITIAL_CHANGELOG_HASH, included_upper_bounds_changeset_revision )
+def set_only_if_compiling_contained_td( repository, required_repository ):
+ """Return True if the received required_repository is only needed to compile a tool dependency defined for the received repository."""
+ # This method is called only from Galaxy when rendering repository dependencies for an installed tool shed repository.
+ # TODO: Do we need to check more than changeset_revision here?
+ required_repository_tup = [ required_repository.tool_shed, required_repository.name, required_repository.owner, required_repository.changeset_revision ]
+ for tup in repository.tuples_of_repository_dependencies_needed_for_compiling_td:
+ partial_tup = tup[ 0:4 ]
+ if partial_tup == required_repository_tup:
+ return 'True'
+ return 'False'
+
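set_only_if_compiling_contained_td reduces to a partial-tuple membership test over the first four components; a standalone sketch (parameter shapes are assumptions based on the code above):

```python
def set_only_if_compiling_contained_td( required_repository_tup, compile_only_dependency_tups ):
    # required_repository_tup: ( tool_shed, name, owner, changeset_revision ).
    # compile_only_dependency_tups: full six-component dependency tuples, as in
    # tuples_of_repository_dependencies_needed_for_compiling_td above.
    for tup in compile_only_dependency_tups:
        # Only the first four components are compared, mirroring partial_tup above.
        if list( tup[ 0:4 ] ) == list( required_repository_tup ):
            return 'True'
    return 'False'
```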
def set_prior_installation_required( repository, required_repository ):
"""Return True if the received required_repository must be installed before the received repository."""
# This method is called only from Galaxy when rendering repository dependencies for an installed tool shed repository.
+ # TODO: Do we need to check more than changeset_revision here?
required_repository_tup = [ required_repository.tool_shed, required_repository.name, required_repository.owner, required_repository.changeset_revision ]
# Get the list of repository dependency tuples associated with the received repository where prior_installation_required is True.
required_rd_tups_that_must_be_installed = repository.requires_prior_installation_of
@@ -1435,9 +1456,9 @@
# Repository dependency tuples in metadata include a prior_installation_required value, so strip it for comparision.
partial_required_rd_tup = required_rd_tup[ 0:4 ]
if partial_required_rd_tup == required_repository_tup:
- # Return the boolean value of prior_installation_required, which defaults to False.
- return required_rd_tup[ 4 ]
- return False
+ # Return the string value of prior_installation_required, which defaults to 'False'.
+ return str( required_rd_tup[ 4 ] )
+ return 'False'
def size_string( raw_text, size=MAX_DISPLAY_SIZE ):
"""Return a subset of a string (up to MAX_DISPLAY_SIZE) translated to a safe string for display in a browser."""
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 lib/tool_shed/util/tool_dependency_util.py
--- a/lib/tool_shed/util/tool_dependency_util.py
+++ b/lib/tool_shed/util/tool_dependency_util.py
@@ -347,15 +347,17 @@
get_installed_and_missing_tool_dependencies( trans, required_repository, tool_dependencies )
if required_repository_installed_tool_dependencies:
# Add the install_dir attribute to the tool_dependencies.
- required_repository_installed_tool_dependencies = add_installation_directories_to_tool_dependencies( trans=trans,
- tool_dependencies=required_repository_installed_tool_dependencies )
+ required_repository_installed_tool_dependencies = \
+ add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=required_repository_installed_tool_dependencies )
for td_key, td_dict in required_repository_installed_tool_dependencies.items():
if td_key not in repository_installed_tool_dependencies:
repository_installed_tool_dependencies[ td_key ] = td_dict
if required_repository_missing_tool_dependencies:
# Add the install_dir attribute to the tool_dependencies.
- required_repository_missing_tool_dependencies = add_installation_directories_to_tool_dependencies( trans=trans,
- tool_dependencies=required_repository_missing_tool_dependencies )
+ required_repository_missing_tool_dependencies = \
+ add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=required_repository_missing_tool_dependencies )
for td_key, td_dict in required_repository_missing_tool_dependencies.items():
if td_key not in repository_missing_tool_dependencies:
repository_missing_tool_dependencies[ td_key ] = td_dict
diff -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 -r ed3c786622741814a07589a1e642cca75c51c822 templates/webapps/tool_shed/repository/common.mako
--- a/templates/webapps/tool_shed/repository/common.mako
+++ b/templates/webapps/tool_shed/repository/common.mako
@@ -619,7 +619,7 @@
<%def name="render_repository_dependency( repository_dependency, pad, parent, row_counter, row_is_header=False )"><%
- from galaxy.util import string_as_bool
+ from galaxy.util import asbool
encoded_id = trans.security.encode_id( repository_dependency.id )
if trans.webapp.name == 'galaxy':
if repository_dependency.tool_shed_repository_id:
@@ -633,8 +633,7 @@
repository_name = str( repository_dependency.repository_name )
repository_owner = str( repository_dependency.repository_owner )
changeset_revision = str( repository_dependency.changeset_revision )
- prior_installation_required = string_as_bool( str( repository_dependency.prior_installation_required ) )
- if prior_installation_required:
+ if asbool( str( repository_dependency.prior_installation_required ) ):
prior_installation_required_str = " <i>(prior install required)</i>"
else:
prior_installation_required_str = ""
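The hunks above replace boolean returns with the string values 'True'/'False' and switch the template from string_as_bool to asbool, so the flag survives a round trip through metadata as text. A minimal sketch of that convention, with a simplified asbool stand-in and hypothetical dependency tuples (the sample data is not from the commit):

```python
# Sketch of the string-flag convention used in the commit above. Repository
# dependency tuples carry (tool_shed, name, owner, changeset_revision,
# prior_installation_required); the flag is stored as the *string* 'True'/'False',
# so comparisons strip it first and callers coerce it with a boolean parser.

def asbool(value):
    # Simplified stand-in for galaxy.util.asbool: parse common truthy strings.
    return str(value).strip().lower() in ('true', 'yes', 'on', '1')

def set_prior_installation_required(required_tup, rd_tups):
    """Return the string flag for the matching dependency, defaulting to 'False'."""
    for tup in rd_tups:
        # Tuples in metadata include the flag, so strip it for comparison.
        if list(tup[0:4]) == list(required_tup):
            return str(tup[4])
    return 'False'

# Hypothetical repository dependency tuples.
rd_tups = [
    ('toolshed.example.org', 'package_x', 'devteam', 'abc123', 'True'),
    ('toolshed.example.org', 'package_y', 'devteam', 'def456', 'False'),
]
flag = set_prior_installation_required(
    ('toolshed.example.org', 'package_x', 'devteam', 'abc123'), rd_tups)
print(flag, asbool(flag))  # True True
```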
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: Dave Bouvier: When the install and test framework has finished testing a repository, use deactivate instead of uninstall, so that installed tool dependencies are retained.
by commits-noreply@bitbucket.org 09 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/149dbd29acbc/
Changeset: 149dbd29acbc
User: Dave Bouvier
Date: 2013-09-09 19:44:03
Summary: When the install and test framework has finished testing a repository, use deactivate instead of uninstall, so that installed tool dependencies are retained.
Affected #: 1 file
diff -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 -r 149dbd29acbce6396a969662a9b899cdbb7bdb46 test/install_and_test_tool_shed_repositories/base/twilltestcase.py
--- a/test/install_and_test_tool_shed_repositories/base/twilltestcase.py
+++ b/test/install_and_test_tool_shed_repositories/base/twilltestcase.py
@@ -6,7 +6,9 @@
log = logging.getLogger( __name__ )
+
class InstallTestRepository( TwillTestCase ):
+
def setUp( self ):
# Security helper
id_secret = os.environ.get( 'GALAXY_INSTALL_TEST_SECRET', 'changethisinproductiontoo' )
@@ -24,21 +26,22 @@
self.galaxy_tool_dependency_dir = os.environ.get( 'GALAXY_INSTALL_TEST_TOOL_DEPENDENCY_DIR' )
self.shed_tools_dict = {}
self.home()
- def initiate_installation_process( self,
- install_tool_dependencies=False,
- install_repository_dependencies=True,
- no_changes=True,
+
+ def initiate_installation_process( self,
+ install_tool_dependencies=False,
+ install_repository_dependencies=True,
+ no_changes=True,
new_tool_panel_section=None ):
html = self.last_page()
- # Since the installation process is by necessity asynchronous, we have to get the parameters to 'manually' initiate the
- # installation process. This regex will return the tool shed repository IDs in group(1), the encoded_kwd parameter in
- # group(2), and the reinstalling flag in group(3) and pass them to the manage_repositories method in the Galaxy
+ # Since the installation process is by necessity asynchronous, we have to get the parameters to 'manually' initiate the
+ # installation process. This regex will return the tool shed repository IDs in group(1), the encoded_kwd parameter in
+ # group(2), and the reinstalling flag in group(3) and pass them to the manage_repositories method in the Galaxy
# admin_toolshed controller.
install_parameters = re.search( 'initiate_repository_installation\( "([^"]+)", "([^"]+)", "([^"]+)" \);', html )
if install_parameters:
iri_ids = install_parameters.group(1)
# In some cases, the returned iri_ids are of the form: "[u'<encoded id>', u'<encoded id>']"
- # This regex ensures that non-hex characters are stripped out of the list, so that util.listify/decode_id
+ # This regex ensures that non-hex characters are stripped out of the list, so that util.listify/decode_id
# will handle them correctly. It's safe to pass the cleaned list to manage_repositories, because it can parse
# comma-separated values.
repository_ids = str( iri_ids )
@@ -49,8 +52,9 @@
( ','.join( util.listify( repository_ids ) ), encoded_kwd, reinstalling )
self.visit_url( url )
return util.listify( repository_ids )
- def install_repository( self, repository_info_dict, install_tool_dependencies=True, install_repository_dependencies=True,
- strings_displayed=[], strings_not_displayed=[], preview_strings_displayed=[],
+
+ def install_repository( self, repository_info_dict, install_tool_dependencies=True, install_repository_dependencies=True,
+ strings_displayed=[], strings_not_displayed=[], preview_strings_displayed=[],
post_submit_strings_displayed=[], new_tool_panel_section=None, **kwd ):
name = repository_info_dict[ 'name' ]
owner = repository_info_dict[ 'owner' ]
@@ -59,23 +63,23 @@
tool_shed_url = repository_info_dict[ 'tool_shed_url' ]
preview_params = urllib.urlencode( dict( repository_id=encoded_repository_id, changeset_revision=changeset_revision ) )
self.visit_url( '%s/repository/preview_tools_in_changeset?%s' % ( tool_shed_url, preview_params ) )
- install_params = urllib.urlencode( dict( repository_ids=encoded_repository_id,
+ install_params = urllib.urlencode( dict( repository_ids=encoded_repository_id,
changeset_revisions=changeset_revision,
galaxy_url=self.url ) )
- # If the tool shed does not have the same hostname as the Galaxy server being used for these tests,
- # twill will not carry over previously set cookies for the Galaxy server when following the
- # install_repositories_by_revision redirect, so we have to include 403 in the allowed HTTP
+ # If the tool shed does not have the same hostname as the Galaxy server being used for these tests,
+ # twill will not carry over previously set cookies for the Galaxy server when following the
+ # install_repositories_by_revision redirect, so we have to include 403 in the allowed HTTP
# status codes and log in again.
url = '%s/repository/install_repositories_by_revision?%s' % ( tool_shed_url, install_params )
self.visit_url( url, allowed_codes=[ 200, 403 ] )
self.logout()
self.login( email='test@bx.psu.edu', username='test' )
- install_params = urllib.urlencode( dict( repository_ids=encoded_repository_id,
+ install_params = urllib.urlencode( dict( repository_ids=encoded_repository_id,
changeset_revisions=changeset_revision,
tool_shed_url=tool_shed_url ) )
url = '/admin_toolshed/prepare_for_install?%s' % install_params
self.visit_url( url )
- # This section is tricky, due to the way twill handles form submission. The tool dependency checkbox needs to
+ # This section is tricky, due to the way twill handles form submission. The tool dependency checkbox needs to
# be hacked in through tc.browser, putting the form field in kwd doesn't work.
if 'install_tool_dependencies' in self.last_page():
form = tc.browser.get_form( 'select_tool_panel_section' )
@@ -105,12 +109,14 @@
self.check_for_strings( post_submit_strings_displayed, strings_not_displayed )
repository_ids = self.initiate_installation_process( new_tool_panel_section=new_tool_panel_section )
self.wait_for_repository_installation( repository_ids )
+
def visit_url( self, url, allowed_codes=[ 200 ] ):
new_url = tc.go( url )
return_code = tc.browser.get_code()
assert return_code in allowed_codes, 'Invalid HTTP return code %s, allowed codes: %s' % \
- ( return_code, ', '.join( str( code ) for code in allowed_codes ) )
+ ( return_code, ', '.join( str( code ) for code in allowed_codes ) )
return new_url
+
def wait_for_repository_installation( self, repository_ids ):
final_states = [ model.ToolShedRepository.installation_status.ERROR,
model.ToolShedRepository.installation_status.INSTALLED ]
@@ -129,12 +135,12 @@
( timeout_counter, repository.status ) )
break
time.sleep( 1 )
+
def uninstall_repository( self, installed_repository ):
url = '/admin_toolshed/deactivate_or_uninstall_repository?id=%s' % self.security.encode_id( installed_repository.id )
self.visit_url( url )
- tc.fv ( 1, "remove_from_disk", 'true' )
+ tc.fv ( 1, "remove_from_disk", 'false' )
tc.submit( 'deactivate_or_uninstall_repository_button' )
strings_displayed = [ 'The repository named' ]
- strings_displayed.append( 'has been uninstalled' )
+ strings_displayed.append( 'has been deactivated' )
self.check_for_strings( strings_displayed, strings_not_displayed=[] )
-
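The comments in the diff above describe pulling the repository ids, the encoded_kwd parameter, and the reinstalling flag out of the rendered page with a regex, then stripping non-hex characters from an id list of the form "[u'<encoded id>', …]" so that util.listify/decode_id can handle it. A self-contained sketch of that extraction; the sample HTML and the exact stripping pattern are assumptions, since the diff only shows the comment describing it:

```python
import re

# Sketch of the parameter extraction described above: grab the three string
# arguments of the initiate_repository_installation(...) call, then keep only
# hex characters and commas in the id list. The HTML snippet is hypothetical.
html = '''initiate_repository_installation( "[u'0a1b2c', u'3d4e5f']", "someencodedkwd", "False" );'''

install_parameters = re.search(
    r'initiate_repository_installation\( "([^"]+)", "([^"]+)", "([^"]+)" \);', html)
iri_ids = install_parameters.group(1)
encoded_kwd = install_parameters.group(2)
reinstalling = install_parameters.group(3)

# Strip everything that is not a hex digit or a comma, so the cleaned,
# comma-separated list can be passed straight to listify/decode_id.
repository_ids = re.sub(r'[^0-9a-fA-F,]', '', str(iri_ids))
print(repository_ids)  # 0a1b2c,3d4e5f
```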
commit/galaxy-central: marten: reworked some jobs from the reports webapp and modified templates accordingly
by commits-noreply@bitbucket.org 09 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/1a868f109ed9/
Changeset: 1a868f109ed9
User: marten
Date: 2013-09-09 19:24:15
Summary: reworked some jobs from the reports webapp and modified templates accordingly
forces NO session for the DB because it is not needed
Affected #: 10 files
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py
+++ b/lib/galaxy/web/framework/__init__.py
@@ -341,6 +341,8 @@
self.__user = None
self.galaxy_session = None
self.error_message = None
+
+
if self.environ.get('is_api_request', False):
# With API requests, if there's a key, use it and associate the
# user with the transaction.
@@ -348,6 +350,8 @@
# If an error message is set here, it's sent back using
# trans.show_error in the response -- in expose_api.
self.error_message = self._authenticate_api( session_cookie )
+ elif self.app.name == "reports":
+ self.galaxy_session = None
else:
#This is a web request, get or create session.
self._ensure_valid_session( session_cookie )
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 lib/galaxy/webapps/reports/controllers/jobs.py
--- a/lib/galaxy/webapps/reports/controllers/jobs.py
+++ b/lib/galaxy/webapps/reports/controllers/jobs.py
@@ -47,26 +47,23 @@
return query
# We are either filtering on a date like YYYY-MM-DD or on a month like YYYY-MM,
# so we need to figure out which type of date we have
- if column_filter.count( '-' ) == 2:
- # We are filtering on a date like YYYY-MM-DD
+ if column_filter.count( '-' ) == 2: # We are filtering on a date like YYYY-MM-DD
year, month, day = map( int, column_filter.split( "-" ) )
start_date = date( year, month, day )
end_date = start_date + timedelta( days=1 )
- return query.filter( and_( self.model_class.table.c.create_time >= start_date,
- self.model_class.table.c.create_time < end_date ) )
- if column_filter.count( '-' ) == 1:
- # We are filtering on a month like YYYY-MM
+ if column_filter.count( '-' ) == 1: # We are filtering on a month like YYYY-MM
year, month = map( int, column_filter.split( "-" ) )
start_date = date( year, month, 1 )
end_date = start_date + timedelta( days=calendar.monthrange( year, month )[1] )
- return query.filter( and_( self.model_class.table.c.create_time >= start_date,
+
+ return query.filter( and_( self.model_class.table.c.create_time >= start_date,
self.model_class.table.c.create_time < end_date ) )
# Grid definition
use_async = False
model_class = model.Job
title = "Jobs"
- template='/webapps/reports/grid.mako'
+ template = '/webapps/reports/grid.mako'
default_sort_key = "id"
columns = [
JobIdColumn( "Id",
@@ -112,12 +109,21 @@
num_rows_per_page = 50
preserve_state = False
use_paging = True
+
def build_initial_query( self, trans, **kwd ):
+ params = util.Params( kwd )
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+ monitor_user_id = get_monitor_id( trans, monitor_email )
return trans.sa_session.query( self.model_class ) \
.join( model.User ) \
+ .filter( model.Job.table.c.user_id != monitor_user_id )\
.enable_eagerloads( False )
class Jobs( BaseUIController ):
+ """
+ Class contains functions for querying data requested by user via the webapp. It exposes the functions and
+ responds to requests with the filled .mako templates.
+ """
specified_date_list_grid = SpecifiedDateListGrid()
@@ -160,7 +166,7 @@
if job.user:
kwd[ 'email' ] = job.user.email
else:
- kwd[ 'email' ] = None # For anonymous users
+ kwd[ 'email' ] = None # For anonymous users
return trans.response.send_redirect( web.url_for( controller='jobs',
action='user_per_month',
**kwd ) )
@@ -169,11 +175,20 @@
elif operation == "unfinished":
kwd[ 'f-state' ] = 'Unfinished'
return self.specified_date_list_grid( trans, **kwd )
+
@web.expose
def specified_month_all( self, trans, **kwd ):
+ """
+ Queries the DB for all jobs in given month, defaults to current month.
+ """
+ message = ''
+
params = util.Params( kwd )
- message = ''
monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
+ # In case we don't know which is the monitor user we will query for all jobs
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
# If specified_date is not received, we'll default to the current month
specified_date = kwd.get( 'specified_date', datetime.utcnow().strftime( "%Y-%m-%d" ) )
specified_month = specified_date[ :7 ]
@@ -182,32 +197,43 @@
end_date = start_date + timedelta( days=calendar.monthrange( year, month )[1] )
month_label = start_date.strftime( "%B" )
year_label = start_date.strftime( "%Y" )
- q = sa.select( ( sa.func.date( model.Job.table.c.create_time ).label( 'date' ),
- sa.func.sum( sa.case( [ ( model.User.table.c.email == monitor_email, 1 ) ], else_=0 ) ).label( 'monitor_jobs' ),
- sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- whereclause = sa.and_( model.Job.table.c.create_time >= start_date,
- model.Job.table.c.create_time < end_date ),
- from_obj = [ sa.outerjoin( model.Job.table, model.User.table ) ],
- group_by = [ 'date' ],
- order_by = [ sa.desc( 'date' ) ] )
+
+
+ month_jobs = sa.select( ( sa.func.date( model.Job.table.c.create_time ).label( 'date' ),
+ sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
+ whereclause=sa.and_( model.Job.table.c.user_id != monitor_user_id,
+ model.Job.table.c.create_time >= start_date,
+ model.Job.table.c.create_time < end_date ),
+ from_obj=[ model.Job.table ],
+ group_by=[ 'date' ],
+ order_by=[ sa.desc( 'date' ) ] )
+
jobs = []
- for row in q.execute():
+ for row in month_jobs.execute():
jobs.append( ( row.date.strftime( "%A" ),
- row.date,
- row.total_jobs - row.monitor_jobs,
- row.monitor_jobs,
+ row.date.strftime( "%d" ),
row.total_jobs,
- row.date.strftime( "%d" ) ) )
+ row.date
+ ) )
return trans.fill_template( '/webapps/reports/jobs_specified_month_all.mako',
month_label=month_label,
year_label=year_label,
month=month,
jobs=jobs,
+ is_user_jobs_only=monitor_user_id,
message=message )
@web.expose
def specified_month_in_error( self, trans, **kwd ):
+ """
+ Queries the DB for the user jobs in error.
+ """
+ message = ''
params = util.Params( kwd )
- message = ''
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
+ # In case we don't know which is the monitor user we will query for all jobs instead
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
# If specified_date is not received, we'll default to the current month
specified_date = kwd.get( 'specified_date', datetime.utcnow().strftime( "%Y-%m-%d" ) )
specified_month = specified_date[ :7 ]
@@ -216,16 +242,19 @@
end_date = start_date + timedelta( days=calendar.monthrange( year, month )[1] )
month_label = start_date.strftime( "%B" )
year_label = start_date.strftime( "%Y" )
- q = sa.select( ( sa.func.date( model.Job.table.c.create_time ).label( 'date' ),
- sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- whereclause = sa.and_( model.Job.table.c.state == 'error',
- model.Job.table.c.create_time >= start_date,
- model.Job.table.c.create_time < end_date ),
- from_obj = [ sa.outerjoin( model.Job.table, model.User.table ) ],
- group_by = [ 'date' ],
- order_by = [ sa.desc( 'date' ) ] )
+
+ month_jobs_in_error = sa.select( ( sa.func.date( model.Job.table.c.create_time ).label( 'date' ),
+ sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
+ whereclause=sa.and_( model.Job.table.c.user_id != monitor_user_id,
+ model.Job.table.c.state == 'error',
+ model.Job.table.c.create_time >= start_date,
+ model.Job.table.c.create_time < end_date ),
+ from_obj=[ model.Job.table ],
+ group_by=[ 'date' ],
+ order_by=[ sa.desc( 'date' ) ] )
+
jobs = []
- for row in q.execute():
+ for row in month_jobs_in_error.execute():
jobs.append( ( row.date.strftime( "%A" ),
row.date,
row.total_jobs,
@@ -235,62 +264,98 @@
year_label=year_label,
month=month,
jobs=jobs,
- message=message )
+ message=message,
+ is_user_jobs_only=monitor_user_id )
@web.expose
def per_month_all( self, trans, **kwd ):
+ """
+ Queries the DB for all jobs. Avoids monitor jobs.
+ """
+
+ message = ''
params = util.Params( kwd )
- message = ''
monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
- q = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ).label( 'date' ),
- sa.func.sum( sa.case( [( model.User.table.c.email == monitor_email, 1 )], else_=0 ) ).label( 'monitor_jobs' ),
- sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- from_obj = [ sa.outerjoin( model.Job.table, model.User.table ) ],
- group_by = [ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
- order_by = [ sa.desc( 'date' ) ] )
+
+ # In case we don't know which is the monitor user we will query for all jobs
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
+ jobs_by_month = sa.select( ( sa.func.date_trunc( 'month', model.Job.table.c.create_time ).label( 'date' ),
+ sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
+ whereclause=model.Job.table.c.user_id != monitor_user_id,
+ from_obj=[ model.Job.table ],
+ group_by=[ 'date' ],
+ order_by=[ sa.desc( 'date' ) ] )
+
jobs = []
- for row in q.execute():
- jobs.append( ( row.date.strftime( "%Y-%m" ),
- row.total_jobs - row.monitor_jobs,
- row.monitor_jobs,
+ for row in jobs_by_month.execute():
+ jobs.append( (
+ row.date.strftime( "%Y-%m" ),
row.total_jobs,
row.date.strftime( "%B" ),
- row.date.strftime( "%Y" ) ) )
+ row.date.strftime( "%y" )
+ ) )
+
return trans.fill_template( '/webapps/reports/jobs_per_month_all.mako',
jobs=jobs,
+ is_user_jobs_only=monitor_user_id,
message=message )
@web.expose
def per_month_in_error( self, trans, **kwd ):
+ """
+ Queries the DB for user jobs in error. Filters out monitor jobs.
+ """
+
+ message = ''
params = util.Params( kwd )
- message = ''
- q = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ).label( 'date' ),
- sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- whereclause = model.Job.table.c.state == 'error',
- from_obj = [ model.Job.table ],
- group_by = [ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
- order_by = [ sa.desc( 'date' ) ] )
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
+ # In case we don't know which is the monitor user we will query for all jobs
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
+ jobs_in_error_by_month = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ).label( 'date' ),
+ sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
+ whereclause=sa.and_ ( model.Job.table.c.state == 'error',
+ model.Job.table.c.user_id != monitor_user_id ),
+ from_obj=[ model.Job.table ],
+ group_by=[ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
+ order_by=[ sa.desc( 'date' ) ] )
+
jobs = []
- for row in q.execute():
+ for row in jobs_in_error_by_month.execute():
jobs.append( ( row.date.strftime( "%Y-%m" ),
row.total_jobs,
row.date.strftime( "%B" ),
- row.date.strftime( "%Y" ) ) )
+ row.date.strftime( "%y" ) ) )
return trans.fill_template( '/webapps/reports/jobs_per_month_in_error.mako',
jobs=jobs,
- message=message )
+ message=message,
+ is_user_jobs_only=monitor_user_id )
+
@web.expose
def per_user( self, trans, **kwd ):
params = util.Params( kwd )
message = ''
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
jobs = []
- q = sa.select( ( model.User.table.c.email.label( 'user_email' ),
+ jobs_per_user = sa.select( ( model.User.table.c.email.label( 'user_email' ),
sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- from_obj = [ sa.outerjoin( model.Job.table, model.User.table ) ],
- group_by = [ 'user_email' ],
- order_by = [ sa.desc( 'total_jobs' ), 'user_email' ] )
- for row in q.execute():
- jobs.append( ( row.user_email,
+ from_obj=[ sa.outerjoin( model.Job.table, model.User.table ) ],
+ group_by=[ 'user_email' ],
+ order_by=[ sa.desc( 'total_jobs' ), 'user_email' ] )
+ for row in jobs_per_user.execute():
+ if ( row.user_email == None ):
+ jobs.append ( ( 'Anonymous',
+ row.total_jobs ) )
+ elif ( row.user_email == monitor_email ):
+ continue
+ else:
+ jobs.append( ( row.user_email,
row.total_jobs ) )
- return trans.fill_template( '/webapps/reports/jobs_per_user.mako', jobs=jobs, message=message )
+ return trans.fill_template( '/webapps/reports/jobs_per_user.mako',
+ jobs=jobs,
+ message=message )
+
@web.expose
def user_per_month( self, trans, **kwd ):
params = util.Params( kwd )
@@ -298,13 +363,13 @@
email = util.restore_text( params.get( 'email', '' ) )
q = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ).label( 'date' ),
sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- whereclause = sa.and_( model.Job.table.c.session_id == model.GalaxySession.table.c.id,
+ whereclause=sa.and_( model.Job.table.c.session_id == model.GalaxySession.table.c.id,
model.GalaxySession.table.c.user_id == model.User.table.c.id,
model.User.table.c.email == email
),
- from_obj = [ sa.join( model.Job.table, model.User.table ) ],
- group_by = [ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
- order_by = [ sa.desc( 'date' ) ] )
+ from_obj=[ sa.join( model.Job.table, model.User.table ) ],
+ group_by=[ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
+ order_by=[ sa.desc( 'date' ) ] )
jobs = []
for row in q.execute():
jobs.append( ( row.date.strftime( "%Y-%m" ),
@@ -314,34 +379,51 @@
return trans.fill_template( '/webapps/reports/jobs_user_per_month.mako',
email=util.sanitize_text( email ),
jobs=jobs, message=message )
+
@web.expose
def per_tool( self, trans, **kwd ):
+ message = ''
+
params = util.Params( kwd )
- message = ''
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
+ # In case we don't know which is the monitor user we will query for all jobs
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
jobs = []
q = sa.select( ( model.Job.table.c.tool_id.label( 'tool_id' ),
sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- from_obj = [ model.Job.table ],
- group_by = [ 'tool_id' ],
- order_by = [ 'tool_id' ] )
+ whereclause=model.Job.table.c.user_id != monitor_user_id,
+ from_obj=[ model.Job.table ],
+ group_by=[ 'tool_id' ],
+ order_by=[ 'tool_id' ] )
for row in q.execute():
jobs.append( ( row.tool_id,
row.total_jobs ) )
return trans.fill_template( '/webapps/reports/jobs_per_tool.mako',
jobs=jobs,
- message=message )
+ message=message,
+ is_user_jobs_only=monitor_user_id )
+
@web.expose
def tool_per_month( self, trans, **kwd ):
+ message = ''
+
params = util.Params( kwd )
- message = ''
+ monitor_email = params.get( 'monitor_email', 'monitor@bx.psu.edu' )
+
+ # In case we don't know which is the monitor user we will query for all jobs
+ monitor_user_id = get_monitor_id( trans, monitor_email )
+
tool_id = params.get( 'tool_id', 'Add a column1' )
specified_date = params.get( 'specified_date', datetime.utcnow().strftime( "%Y-%m-%d" ) )
q = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ).label( 'date' ),
sa.func.count( model.Job.table.c.id ).label( 'total_jobs' ) ),
- whereclause = model.Job.table.c.tool_id == tool_id,
- from_obj = [ model.Job.table ],
- group_by = [ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
- order_by = [ sa.desc( 'date' ) ] )
+ whereclause=sa.and_( model.Job.table.c.tool_id == tool_id,
+ model.Job.table.c.user_id != monitor_user_id ),
+ from_obj=[ model.Job.table ],
+ group_by=[ sa.func.date_trunc( 'month', sa.func.date( model.Job.table.c.create_time ) ) ],
+ order_by=[ sa.desc( 'date' ) ] )
jobs = []
for row in q.execute():
jobs.append( ( row.date.strftime( "%Y-%m" ),
@@ -352,7 +434,9 @@
specified_date=specified_date,
tool_id=tool_id,
jobs=jobs,
- message=message )
+ message=message,
+ is_user_jobs_only=monitor_user_id )
+
@web.expose
def job_info( self, trans, **kwd ):
params = util.Params( kwd )
@@ -362,49 +446,20 @@
return trans.fill_template( '/webapps/reports/job_info.mako',
job=job,
message=message )
- @web.expose
- def per_domain( self, trans, **kwd ):
- # TODO: rewrite using alchemy
- params = util.Params( kwd )
- message = ''
- engine = model.mapping.metadata.engine
- jobs = []
- s = """
- SELECT
- substr(bar.first_pass_domain, bar.dot_position, 4) AS domain,
- count(job_id) AS total_jobs
- FROM
- (SELECT
- user_email AS user_email,
- first_pass_domain,
- position('.' in first_pass_domain) AS dot_position,
- job_id AS job_id
- FROM
- (SELECT
- email AS user_email,
- substr(email, char_length(email)-3, char_length(email)) AS first_pass_domain,
- job.id AS job_id
- FROM
- job
- LEFT OUTER JOIN galaxy_session ON galaxy_session.id = job.session_id
- LEFT OUTER JOIN galaxy_user ON galaxy_session.user_id = galaxy_user.id
- WHERE
- job.session_id = galaxy_session.id
- AND
- galaxy_session.user_id = galaxy_user.id
- ) AS foo
- ) AS bar
- GROUP BY
- domain
- ORDER BY
- total_jobs DESC
- """
- job_rows = engine.text( s ).execute().fetchall()
- for job in job_rows:
- jobs.append( ( job.domain, job.total_jobs ) )
- return trans.fill_template( '/webapps/reports/jobs_per_domain.mako', jobs=jobs, message=message )
## ---- Utility methods -------------------------------------------------------
def get_job( trans, id ):
return trans.sa_session.query( trans.model.Job ).get( trans.security.decode_id( id ) )
+
+def get_monitor_id( trans, monitor_email ):
+ """
+ A convenience method to obtain the monitor job id.
+ """
+ monitor_user_id = None
+ monitor_row = trans.sa_session.query( trans.model.User.table.c.id ) \
+ .filter( trans.model.User.table.c.email == monitor_email ) \
+ .first()
+ if monitor_row is not None:
+ monitor_user_id = monitor_row[0]
+ return monitor_user_id
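The reworked controller above resolves the monitor user's id once (get_monitor_id, returning None when the email is unknown, in which case all jobs are counted) and then excludes that user's jobs from each query with a user_id != monitor_user_id clause. A plain-Python sketch of the same filtering, with in-memory rows standing in for the Job and User tables (all sample data is hypothetical):

```python
from collections import Counter

# Hypothetical stand-ins for the galaxy_user and job tables.
users = [
    {'id': 1, 'email': 'monitor@bx.psu.edu'},
    {'id': 2, 'email': 'alice@example.org'},
]
jobs = [
    {'user_id': 1, 'create_time': '2013-09-01'},
    {'user_id': 2, 'create_time': '2013-09-02'},
    {'user_id': 2, 'create_time': '2013-08-15'},
]

def get_monitor_id(users, monitor_email):
    """Return the monitor user's id, or None if unknown (then nothing is excluded)."""
    for user in users:
        if user['email'] == monitor_email:
            return user['id']
    return None

def jobs_per_month(jobs, monitor_user_id):
    # Group by YYYY-MM, skipping the monitor user's jobs, newest month first.
    counts = Counter(j['create_time'][:7] for j in jobs
                     if j['user_id'] != monitor_user_id)
    return sorted(counts.items(), reverse=True)

monitor_user_id = get_monitor_id(users, 'monitor@bx.psu.edu')
print(jobs_per_month(jobs, monitor_user_id))  # [('2013-09', 1), ('2013-08', 1)]
```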
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_per_domain.mako
--- a/templates/webapps/reports/jobs_per_domain.mako
+++ /dev/null
@@ -1,35 +0,0 @@
-<%inherit file="/base.mako"/>
-<%namespace file="/message.mako" import="render_msg" />
-
-%if message:
- ${render_msg( message, 'done' )}
-%endif
-
-<div class="toolForm">
- <div class="toolFormBody">
- <h3 align="center">Jobs Per Domain</h3>
- <h4 align="center">This report does not account for unauthenticated users.</h4>
- <table align="center" width="60%" class="colored">
- %if len( jobs ) == 0:
- <tr><td colspan="2">There are no jobs</td></tr>
- %else:
- <tr class="header">
- <td>Domain</td>
- <td>Total Jobs</td>
- </tr>
- <% ctr = 0 %>
- %for job in jobs:
- %if ctr % 2 == 1:
- <tr class="odd_row">
- %else:
- <tr class="tr">
- %endif
- <td>${job[0]}</td>
- <td>${job[1]}</td>
- </tr>
- <% ctr += 1 %>
- %endfor
- %endif
- </table>
- </div>
-</div>
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_per_month_all.mako
--- a/templates/webapps/reports/jobs_per_month_all.mako
+++ b/templates/webapps/reports/jobs_per_month_all.mako
@@ -7,17 +7,19 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs Per Month</h3>
- <h4 align="center">Click Month to view the number of jobs for each day of that month</h4>
+ <h4 align="center">Jobs Per Month</h4>
+ <h5 align="center">Click Month to view details.</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
- <tr><td colspan="4">There are no jobs</td></tr>
+ <tr><td colspan="4">There are no jobs.</td></tr>
%else:
<tr class="header"><td>Month</td>
- <td>User Jobs</td>
- <td>Monitor Jobs</td>
- <td>Total</td>
+ %if is_user_jobs_only:
+ <td>User Jobs</td>
+ %else:
+ <td>User + Monitor Jobs</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
@@ -26,10 +28,8 @@
%else:
<tr class="tr">
%endif
- <td><a href="${h.url_for( controller='jobs', action='specified_month_all', specified_date=job[0]+'-01' )}">${job[4]} ${job[5]}</a></td>
+ <td><a href="${h.url_for( controller='jobs', action='specified_month_all', specified_date=job[0]+'-01' )}">${job[2]} ${job[3]}</a></td><td>${job[1]}</td>
- <td>${job[2]}</td>
- <td>${job[3]}</td></tr><% ctr += 1 %>
%endfor
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_per_month_in_error.mako
--- a/templates/webapps/reports/jobs_per_month_in_error.mako
+++ b/templates/webapps/reports/jobs_per_month_in_error.mako
@@ -7,15 +7,19 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs In Error Per Month</h3>
- <h4 align="center">Click Month to view the number of jobs in error for each day of that month</h4>
+ <h4 align="center">Jobs In Error Per Month</h4>
+ <h5 align="center">Click Month to view details.</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
- <tr><td colspan="2">There are no jobs in the error state</td></tr>
+ <tr><td colspan="2">There are no jobs in the error state.</td></tr>
%else:
<tr class="header"><td>Month</td>
- <td>Jobs In Error</td>
+ %if is_user_jobs_only:
+ <td>User Jobs in Error</td>
+ %else:
+ <td>User + Monitor Jobs in Error</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_per_tool.mako
--- a/templates/webapps/reports/jobs_per_tool.mako
+++ b/templates/webapps/reports/jobs_per_tool.mako
@@ -7,15 +7,19 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs Per Tool</h3>
- <h4 align="center">Click Tool Id to view jobs per month for that tool</h4>
+ <h4 align="center">Jobs Per Tool</h4>
+ <h5 align="center">Click Tool Id to view details</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
- <tr><td colspan="2">There are no jobs</td></tr>
+ <tr><td colspan="2">There are no jobs.</td></tr>
%else:
<tr class="header"><td>Tool id</td>
- <td>Total Jobs</td>
+ %if is_user_jobs_only:
+ <td>User Jobs</td>
+ %else:
+ <td>User + Monitor Jobs</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_per_user.mako
--- a/templates/webapps/reports/jobs_per_user.mako
+++ b/templates/webapps/reports/jobs_per_user.mako
@@ -7,11 +7,11 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs Per User</h3>
- <h4 align="center">Click User to view jobs per month for that user</h4>
+ <h4 align="center">Jobs Per User</h4>
+ <h5 align="center">Click User to view details.</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
- <tr><td colspan="2">There are no jobs</td></tr>
+ <tr><td colspan="2">There are no jobs.</td></tr>
%else:
<tr class="header"><td>User</td>
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_specified_month_all.mako
--- a/templates/webapps/reports/jobs_specified_month_all.mako
+++ b/templates/webapps/reports/jobs_specified_month_all.mako
@@ -7,8 +7,8 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">All Jobs for ${month_label} ${year_label}</h3>
- <h4 align="center">Click Total Jobs to see jobs for that day</h4>
+ <h4 align="center">Jobs for ${month_label} ${year_label}</h4>
+ <h5 align="center">Click Jobs to see their details</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
<tr><td colspan="5">There are no jobs for ${month_label} ${year_label}</td></tr>
@@ -16,9 +16,11 @@
<tr class="header"><td>Day</td><td>Date</td>
- <td>User Jobs</td>
- <td>Monitor Jobs</td>
- <td>Total Jobs</td>
+ %if is_user_jobs_only:
+ <td>User Jobs</td>
+ %else:
+ <td>User + Monitor Jobs</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
@@ -28,10 +30,8 @@
<tr class="tr">
%endif
<td>${job[0]}</td>
- <td>${month_label} ${job[5]}, ${year_label}</td>
- <td>${job[2]}</td>
- <td>${job[3]}</td>
- <td><a href="${h.url_for( controller='jobs', action='specified_date_handler', specified_date=job[1], webapp='reports' )}">${job[4]}</a></td>
+ <td>${month_label} ${job[1]}, ${year_label}</td>
+ <td><a href="${h.url_for( controller='jobs', action='specified_date_handler', specified_date=job[3], webapp='reports' )}">${job[2]}</a></td></tr><% ctr += 1 %>
%endfor
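The recurring edit across these report templates replaces two fixed count columns with a single column whose label depends on `is_user_jobs_only`. Stripped of Mako syntax, the header choice is just a two-way branch (a minimal sketch, not code from the commit):

```python
def job_count_header(is_user_jobs_only):
    """Pick the report column label, as the Mako %if/%else blocks
    added to each template do."""
    if is_user_jobs_only:
        return 'User Jobs'
    return 'User + Monitor Jobs'

print(job_count_header(True))   # → User Jobs
print(job_count_header(False))  # → User + Monitor Jobs
```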
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_specified_month_in_error.mako
--- a/templates/webapps/reports/jobs_specified_month_in_error.mako
+++ b/templates/webapps/reports/jobs_specified_month_in_error.mako
@@ -7,8 +7,8 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs in Error for ${month_label} ${year_label}</h3>
- <h4 align="center">Click Jobs in Error to view jobs in error for that day</h4>
+ <h4 align="center">Jobs in Error for ${month_label} ${year_label}</h4>
+ <h5 align="center">Click Jobs to see their details</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
<tr><td colspan="3">There are no jobs in the error state for ${month_label} ${year_label}</td></tr>
@@ -16,7 +16,11 @@
<tr class="header"><td>Day</td><td>Date</td>
- <td>Jobs in Error</td>
+ %if is_user_jobs_only:
+ <td>User Jobs in Error</td>
+ %else:
+ <td>User + Monitor Jobs in Error</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
diff -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 -r 1a868f109ed985b738c709422dbd6cccbc5cdd32 templates/webapps/reports/jobs_tool_per_month.mako
--- a/templates/webapps/reports/jobs_tool_per_month.mako
+++ b/templates/webapps/reports/jobs_tool_per_month.mako
@@ -7,15 +7,19 @@
<div class="toolForm"><div class="toolFormBody">
- <h3 align="center">Jobs per month for tool "${tool_id}"</h3>
- <h4 align="center">Click Total Jobs to see the tool's jobs for that month</h4>
+ <h4 align="center">Jobs per month for tool "${tool_id}"</h4>
+ <h5 align="center">Click Jobs to view details</h5><table align="center" width="60%" class="colored">
%if len( jobs ) == 0:
<tr><td colspan="2">There are no jobs for tool "${tool_id}"</td></tr>
%else:
<tr class="header"><td>Month</td>
- <td>Total Jobs</td>
+ %if is_user_jobs_only:
+ <td>User Jobs</td>
+ %else:
+ <td>User + Monitor Jobs</td>
+ %endif
</tr><% ctr = 0 %>
%for job in jobs:
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/97d020901403/
Changeset: 97d020901403
Branch: stable
User: carlfeberhard
Date: 2013-09-09 17:20:40
Summary: Fix to dbkey select in library upload when 'unspecified' is not in dbkey list
Affected #: 1 file
diff -r 9fe9aa6ac504bc7703ba5d465e68170d79e89905 -r 97d0209014039d0f9aaa7f48a188708de1829786 templates/webapps/galaxy/library/common/common.mako
--- a/templates/webapps/galaxy/library/common/common.mako
+++ b/templates/webapps/galaxy/library/common/common.mako
@@ -275,8 +275,9 @@
# move unspecified to the first option and set as default if not last_used_build
#TODO: remove when we decide on a common dbkey selector widget
unspecified = ('unspecified (?)', '?')
- dbkeys.remove( unspecified )
- dbkeys.insert( 0, unspecified )
+ if unspecified in dbkeys:
+ dbkeys.remove( unspecified )
+ dbkeys.insert( 0, unspecified )
default_selected = last_used_build or '?'
%>
%for dbkey in dbkeys:
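The guard added in this changeset follows a general Python pattern: `list.remove()` raises `ValueError` when the element is absent, so membership is tested before mutating. A standalone sketch of the same move-to-front logic (not the Mako template itself):

```python
def move_to_front(options, item):
    """Return a copy of options with item moved to index 0, if present.

    Same guard as the dbkey fix: list.remove() raises ValueError when
    the element is absent, so test membership before mutating.
    """
    options = list(options)  # don't mutate the caller's list
    if item in options:
        options.remove(item)
        options.insert(0, item)
    return options

unspecified = ('unspecified (?)', '?')
dbkeys = [('Human (hg19)', 'hg19'), unspecified]
print(move_to_front(dbkeys, unspecified)[0])  # → ('unspecified (?)', '?')
```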
https://bitbucket.org/galaxy/galaxy-central/commits/b62b2b4daeac/
Changeset: b62b2b4daeac
User: carlfeberhard
Date: 2013-09-09 17:21:06
Summary: Merge stable
Affected #: 1 file
diff -r 354fb3e262bbf08cf5b77f70da1bb45ee947841c -r b62b2b4daeacc819016d6208078fa1fdd5b467c4 templates/webapps/galaxy/library/common/common.mako
--- a/templates/webapps/galaxy/library/common/common.mako
+++ b/templates/webapps/galaxy/library/common/common.mako
@@ -275,8 +275,9 @@
# move unspecified to the first option and set as default if not last_used_build
#TODO: remove when we decide on a common dbkey selector widget
unspecified = ('unspecified (?)', '?')
- dbkeys.remove( unspecified )
- dbkeys.insert( 0, unspecified )
+ if unspecified in dbkeys:
+ dbkeys.remove( unspecified )
+ dbkeys.insert( 0, unspecified )
default_selected = last_used_build or '?'
%>
%for dbkey in dbkeys:
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: Richard Burhans: Fixed initial installation of virtualenv
by commits-noreply@bitbucket.org 06 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/354fb3e262bb/
Changeset: 354fb3e262bb
User: Richard Burhans
Date: 2013-09-07 00:03:51
Summary: Fixed initial installation of virtualenv
Affected #: 1 file
diff -r 8f3110cc7851e67256a6d24d7e41ad65e5a507dd -r 354fb3e262bbf08cf5b77f70da1bb45ee947841c lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -155,12 +155,10 @@
if not os.path.exists( venv_dir ):
with make_tmp_dir() as work_dir:
downloaded_filename = VIRTUALENV_URL.rsplit('/', 1)[-1]
- downloaded_file_path = td_common_util.url_download( work_dir, downloaded_filename, VIRTUALENV_URL )
- if td_common_util.istar( downloaded_file_path ):
- td_common_util.extract_tar( downloaded_file_path, work_dir )
- dir = td_common_util.tar_extraction_directory( work_dir, downloaded_filename )
- else:
- log.error( "Failed to download virtualenv: Downloaded file '%s' is not a tar file", downloaded_filename )
+ try:
+ dir = td_common_util.url_download( work_dir, downloaded_filename, VIRTUALENV_URL )
+ except:
+ log.error( "Failed to download virtualenv: td_common_util.url_download( '%s', '%s', '%s' ) threw an exception", work_dir, downloaded_filename, VIRTUALENV_URL )
return False
full_path_to_dir = os.path.abspath( os.path.join( work_dir, dir ) )
shutil.move( full_path_to_dir, venv_dir )
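The fix collapses the manual tar-detection branch into a single download call wrapped in a try/except that logs and returns `False` instead of raising. A self-contained sketch of that control flow, with `urllib` standing in for `td_common_util.url_download()` (one note: the commit uses a bare `except:`, which also swallows `KeyboardInterrupt`; catching `Exception` is the narrower choice shown here):

```python
import logging
import os
import shutil
import tempfile
import urllib.request

log = logging.getLogger(__name__)

def fetch_and_install(url, dest_dir):
    """Download url into a scratch dir, then move the file into dest_dir.

    Mirrors the commit's shape: one download call, one except branch
    that logs the failing arguments and bails out with False.
    """
    work_dir = tempfile.mkdtemp()
    try:
        downloaded_filename = url.rsplit('/', 1)[-1]
        downloaded_path = os.path.join(work_dir, downloaded_filename)
        try:
            urllib.request.urlretrieve(url, downloaded_path)
        except Exception:
            log.error("Failed to download virtualenv from '%s'", url)
            return False
        shutil.move(downloaded_path, dest_dir)
        return True
    finally:
        # Clean up the scratch dir whether or not the download succeeded.
        shutil.rmtree(work_dir, ignore_errors=True)
```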
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: dan: Display command-line on dataset info page for admin users.
by commits-noreply@bitbucket.org 06 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/8f3110cc7851/
Changeset: 8f3110cc7851
User: dan
Date: 2013-09-06 23:09:58
Summary: Display command-line on dataset info page for admin users.
Affected #: 1 file
diff -r 1a5f25751066598601aef4ad34da4b3bea8deaca -r 8f3110cc7851e67256a6d24d7e41ad65e5a507dd templates/show_params.mako
--- a/templates/show_params.mako
+++ b/templates/show_params.mako
@@ -123,6 +123,9 @@
%if trans.user_is_admin() or trans.app.config.expose_dataset_path:
<tr><td>Full Path:</td><td>${hda.file_name | h}</td></tr>
%endif
+ %if job and job.command_line and trans.user_is_admin():
+ <tr><td>Job Command-Line:</td><td>${ job.command_line | h }</td></tr>
+ %endif
</table><br />
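The `| h` filter in the added Mako line HTML-escapes the command line before display. In plain Python the equivalent is `html.escape`, which matters here because job command lines routinely contain `<`, `>`, and `&` (an illustrative command line, not one from the commit):

```python
import html

command_line = "bwa mem -t 4 ref.fa reads.fq > out.sam 2> log.txt"
# Escape before embedding in markup, as the Mako `| h` filter does,
# so shell redirections aren't parsed as HTML tags.
safe = html.escape(command_line)
print(safe)  # → bwa mem -t 4 ref.fa reads.fq &gt; out.sam 2&gt; log.txt
```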
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: Dave Bouvier: Fix tool shed API value mappers.
by commits-noreply@bitbucket.org 06 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/1a5f25751066/
Changeset: 1a5f25751066
User: Dave Bouvier
Date: 2013-09-06 23:06:18
Summary: Fix tool shed API value mappers.
Affected #: 3 files
diff -r f82d49deb6d05648d2ab114cff6d689b38806d65 -r 1a5f25751066598601aef4ad34da4b3bea8deaca lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -14,18 +14,6 @@
log = logging.getLogger( __name__ )
-def default_repository_value_mapper( trans, repository ):
- value_mapper={ 'id' : trans.security.encode_id( repository.id ),
- 'user_id' : trans.security.encode_id( repository.user_id ) }
- return value_mapper
-
-def default_repository_metadata_value_mapper( trans, repository_metadata ):
- value_mapper = { 'id' : trans.security.encode_id( repository_metadata.id ),
- 'repository_id' : trans.security.encode_id( repository_metadata.repository_id ) }
- if repository_metadata.time_last_tested:
- value_mapper[ 'time_last_tested' ] = time_ago( repository_metadata.time_last_tested )
- return value_mapper
-
class RepositoriesController( BaseAPIController ):
"""RESTful controller for interactions with repositories in the Tool Shed."""
@@ -108,12 +96,17 @@
]
}
"""
+ metadata_value_mapper = { 'id' : trans.security.encode_id,
+ 'repository_id' : trans.security.encode_id,
+ 'time_last_tested' : time_ago }
+ repository_value_mapper = { 'id' : trans.security.encode_id,
+ 'user_id' : trans.security.encode_id }
# Example URL: http://localhost:9009/api/repositories/get_repository_revision_install_info…
try:
# Get the repository information.
repository = suc.get_repository_by_name_and_owner( trans.app, name, owner )
encoded_repository_id = trans.security.encode_id( repository.id )
- repository_dict = repository.to_dict( view='element', value_mapper=default_repository_value_mapper( trans, repository ) )
+ repository_dict = repository.to_dict( view='element', value_mapper=repository_value_mapper )
repository_dict[ 'url' ] = web.url_for( controller='repositories',
action='show',
id=encoded_repository_id )
@@ -130,7 +123,7 @@
if repository_metadata:
encoded_repository_metadata_id = trans.security.encode_id( repository_metadata.id )
repository_metadata_dict = repository_metadata.to_dict( view='collection',
- value_mapper=default_repository_metadata_value_mapper( trans, repository_metadata ) )
+ value_mapper=metadata_value_mapper )
repository_metadata_dict[ 'url' ] = web.url_for( controller='repository_revisions',
action='show',
id=encoded_repository_metadata_id )
@@ -155,6 +148,8 @@
GET /api/repositories
Displays a collection (list) of repositories.
"""
+ value_mapper = { 'id' : trans.security.encode_id,
+ 'user_id' : trans.security.encode_id }
# Example URL: http://localhost:9009/api/repositories
repository_dicts = []
deleted = util.string_as_bool( deleted )
@@ -164,7 +159,7 @@
.order_by( trans.app.model.Repository.table.c.name ) \
.all()
for repository in query:
- repository_dict = repository.to_dict( view='collection', value_mapper=default_repository_value_mapper( trans, repository ) )
+ repository_dict = repository.to_dict( view='collection', value_mapper=value_mapper )
repository_dict[ 'url' ] = web.url_for( controller='repositories',
action='show',
id=trans.security.encode_id( repository.id ) )
@@ -184,10 +179,12 @@
:param id: the encoded id of the Repository object
"""
+ value_mapper = { 'id' : trans.security.encode_id,
+ 'user_id' : trans.security.encode_id }
# Example URL: http://localhost:9009/api/repositories/f9cad7b01a472135
try:
repository = suc.get_repository_in_tool_shed( trans, id )
- repository_dict = repository.to_dict( view='element', value_mapper=default_repository_value_mapper( trans, repository ) )
+ repository_dict = repository.to_dict( view='element', value_mapper=value_mapper )
repository_dict[ 'url' ] = web.url_for( controller='repositories',
action='show',
id=trans.security.encode_id( repository.id ) )
diff -r f82d49deb6d05648d2ab114cff6d689b38806d65 -r 1a5f25751066598601aef4ad34da4b3bea8deaca lib/galaxy/webapps/tool_shed/api/repository_revisions.py
--- a/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
+++ b/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
@@ -11,13 +11,6 @@
log = logging.getLogger( __name__ )
-def default_value_mapper( trans, repository_metadata ):
- value_mapper = { 'id' : trans.security.encode_id( repository_metadata.id ),
- 'repository_id' : trans.security.encode_id( repository_metadata.repository_id ) }
- if repository_metadata.time_last_tested:
- value_mapper[ 'time_last_tested' ] = time_ago( repository_metadata.time_last_tested )
- return value_mapper
-
class RepositoryRevisionsController( BaseAPIController ):
"""RESTful controller for interactions with tool shed repository revisions."""
@@ -79,6 +72,9 @@
GET /api/repository_revisions
Displays a collection (list) of repository revisions.
"""
+ value_mapper = { 'id' : trans.security.encode_id,
+ 'repository_id' : trans.security.encode_id,
+ 'time_last_tested' : time_ago }
# Example URL: http://localhost:9009/api/repository_revisions
repository_metadata_dicts = []
# Build up an anded clause list of filters.
@@ -128,7 +124,7 @@
.all()
for repository_metadata in query:
repository_metadata_dict = repository_metadata.to_dict( view='collection',
- value_mapper=default_value_mapper( trans, repository_metadata ) )
+ value_mapper=value_mapper )
repository_metadata_dict[ 'url' ] = web.url_for( controller='repository_revisions',
action='show',
id=trans.security.encode_id( repository_metadata.id ) )
@@ -148,10 +144,13 @@
:param id: the encoded id of the `RepositoryMetadata` object
"""
+ value_mapper = { 'id' : trans.security.encode_id,
+ 'repository_id' : trans.security.encode_id,
+ 'time_last_tested' : time_ago }
# Example URL: http://localhost:9009/api/repository_revisions/bb125606ff9ea620
try:
repository_metadata = metadata_util.get_repository_metadata_by_id( trans, id )
- repository_metadata_dict = repository_metadata.as_dict( value_mapper=default_value_mapper( trans, repository_metadata ) )
+ repository_metadata_dict = repository_metadata.to_dict( value_mapper=value_mapper )
repository_metadata_dict[ 'url' ] = web.url_for( controller='repository_revisions',
action='show',
id=trans.security.encode_id( repository_metadata.id ) )
@@ -168,6 +167,7 @@
PUT /api/repository_revisions/{encoded_repository_metadata_id}/{payload}
Updates the value of specified columns of the repository_metadata table based on the key / value pairs in payload.
"""
+ value_mapper = dict( id=trans.security.encode_id, repository_id=trans.security.encode_id, time_last_tested=time_ago )
repository_metadata_id = kwd.get( 'id', None )
try:
repository_metadata = metadata_util.get_repository_metadata_by_id( trans, repository_metadata_id )
@@ -188,7 +188,7 @@
log.error( message, exc_info=True )
trans.response.status = 500
return message
- repository_metadata_dict = repository_metadata.as_dict( value_mapper=default_value_mapper( trans, repository_metadata ) )
+ repository_metadata_dict = repository_metadata.to_dict( value_mapper=value_mapper )
repository_metadata_dict[ 'url' ] = web.url_for( controller='repository_revisions',
action='show',
id=trans.security.encode_id( repository_metadata.id ) )
diff -r f82d49deb6d05648d2ab114cff6d689b38806d65 -r 1a5f25751066598601aef4ad34da4b3bea8deaca lib/galaxy/webapps/tool_shed/model/__init__.py
--- a/lib/galaxy/webapps/tool_shed/model/__init__.py
+++ b/lib/galaxy/webapps/tool_shed/model/__init__.py
@@ -154,9 +154,6 @@
self.times_downloaded = times_downloaded
self.deprecated = deprecated
- def as_dict( self, value_mapper=None ):
- return self.to_dict( view='element', value_mapper=value_mapper )
-
def can_change_type( self, app ):
# Allow changing the type only if the repository has no contents, has never been installed, or has never been changed from
# the default type.
@@ -233,7 +230,7 @@
class RepositoryMetadata( object, Dictifiable ):
dict_collection_visible_keys = ( 'id', 'repository_id', 'changeset_revision', 'malicious', 'downloadable', 'has_repository_dependencies', 'includes_datatypes',
- 'includes_tools', 'includes_tool_dependencies', 'includes_tools_for_display_in_tool_panel', 'includes_workflows' )
+ 'includes_tools', 'includes_tool_dependencies', 'includes_tools_for_display_in_tool_panel', 'includes_workflows', 'time_last_tested' )
dict_element_visible_keys = ( 'id', 'repository_id', 'changeset_revision', 'malicious', 'downloadable', 'tools_functionally_correct', 'do_not_test',
'test_install_error', 'time_last_tested', 'tool_test_results', 'has_repository_dependencies', 'includes_datatypes',
'includes_tools', 'includes_tool_dependencies', 'includes_tools_for_display_in_tool_panel', 'includes_workflows' )
@@ -270,9 +267,6 @@
return True
return False
- def as_dict( self, value_mapper=None ):
- return self.to_dict( view='element', value_mapper=value_mapper )
-
class SkipToolTest( object, Dictifiable ):
dict_collection_visible_keys = ( 'id', 'repository_metadata_id', 'initial_changeset_revision' )
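The refactor above replaces per-instance helper functions (which pre-computed encoded values for one object) with plain dicts mapping column names to callables, so a single mapper works for every row. A minimal sketch of how a `to_dict` can consume such a mapper; `encode` is a hypothetical stand-in for `trans.security.encode_id`:

```python
def to_dict(obj, visible_keys, value_mapper=None):
    """Serialize obj's visible attributes, passing each value through
    the matching callable in value_mapper when one is registered."""
    value_mapper = value_mapper or {}
    result = {}
    for key in visible_keys:
        value = getattr(obj, key)
        if key in value_mapper:
            value = value_mapper[key](value)
        result[key] = value
    return result

class Repository:
    def __init__(self, id, user_id):
        self.id = id
        self.user_id = user_id

# Stand-in for trans.security.encode_id: any one-argument callable works,
# and the same dict can be reused across all rows of a query.
encode = lambda raw_id: 'x%04d' % raw_id
value_mapper = {'id': encode, 'user_id': encode}
print(to_dict(Repository(7, 42), ('id', 'user_id'), value_mapper))
```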
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: Dave Bouvier: Set the tool dependency path for repository installation and testing.
by commits-noreply@bitbucket.org 06 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f82d49deb6d0/
Changeset: f82d49deb6d0
User: Dave Bouvier
Date: 2013-09-06 22:53:07
Summary: Set the tool dependency path for repository installation and testing.
Affected #: 1 file
diff -r f4a839856cfb06c01845cdd96f2893f4b70250bb -r f82d49deb6d05648d2ab114cff6d689b38806d65 test/install_and_test_tool_shed_repositories/functional_tests.py
--- a/test/install_and_test_tool_shed_repositories/functional_tests.py
+++ b/test/install_and_test_tool_shed_repositories/functional_tests.py
@@ -567,6 +567,7 @@
if tool_dependency_dir is None:
tool_dependency_dir = tempfile.mkdtemp( dir=galaxy_test_tmp_dir )
os.environ[ 'GALAXY_INSTALL_TEST_TOOL_DEPENDENCY_DIR' ] = tool_dependency_dir
+ os.environ[ 'GALAXY_TOOL_DEPENDENCY_DIR' ] = tool_dependency_dir
if 'GALAXY_INSTALL_TEST_DBURI' in os.environ:
database_connection = os.environ[ 'GALAXY_INSTALL_TEST_DBURI' ]
else:
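The added line mirrors the one above it: both environment variables end up pointing at the same directory, created on demand when none is configured. The fallback pattern can be sketched as (`resolve_tool_dependency_dir` is an illustrative name, not a function from the commit):

```python
import os
import tempfile

def resolve_tool_dependency_dir(base_tmp_dir, configured=None):
    """Use the configured dir if given, else make a fresh temp dir under
    base_tmp_dir; export it under both env var names, as in the fix."""
    tool_dependency_dir = configured or tempfile.mkdtemp(dir=base_tmp_dir)
    os.environ['GALAXY_INSTALL_TEST_TOOL_DEPENDENCY_DIR'] = tool_dependency_dir
    os.environ['GALAXY_TOOL_DEPENDENCY_DIR'] = tool_dependency_dir
    return tool_dependency_dir
```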
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Partial framework support for enhanced tool dependency definitions that enable installation of binaries into specified architectures. More coming soon...
by commits-noreply@bitbucket.org 06 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f4a839856cfb/
Changeset: f4a839856cfb
User: greg
Date: 2013-09-06 15:38:12
Summary: Partial framework support for enhanced tool dependency definitions that enable installation of binaries into specified architectures. More coming soon...
Affected #: 8 files
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -252,7 +252,7 @@
tool_shed_repository.uninstalled = True
# Remove all installed tool dependencies, but don't touch any repository dependencies..
for tool_dependency in tool_shed_repository.installed_tool_dependencies:
- uninstalled, error_message = tool_dependency_util.remove_tool_dependency( trans, tool_dependency )
+ uninstalled, error_message = tool_dependency_util.remove_tool_dependency( trans.app, tool_dependency )
if error_message:
errors = '%s %s' % ( errors, error_message )
tool_shed_repository.deleted = True
@@ -1559,7 +1559,7 @@
if tool_dependency.can_uninstall:
tool_dependencies_for_uninstallation.append( tool_dependency )
for tool_dependency in tool_dependencies_for_uninstallation:
- uninstalled, error_message = tool_dependency_util.remove_tool_dependency( trans, tool_dependency )
+ uninstalled, error_message = tool_dependency_util.remove_tool_dependency( trans.app, tool_dependency )
if error_message:
errors = True
message = '%s %s' % ( message, error_message )
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -281,8 +281,6 @@
source_dir=os.path.join( action_dict[ 'source_directory' ] ),
destination_dir=os.path.join( action_dict[ 'destination_directory' ] ) )
elif action_type == 'move_file':
- # TODO: Remove this hack that resets current_dir so that the pre-compiled bwa binary can be found.
- # current_dir = '/Users/gvk/workspaces_2008/bwa/bwa-0.5.9'
td_common_util.move_file( current_dir=current_dir,
source=os.path.join( action_dict[ 'source' ] ),
destination_dir=os.path.join( action_dict[ 'destination' ] ) )
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -12,6 +12,7 @@
from tool_shed.util import encoding_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import xml_util
+from tool_shed.galaxy_install.tool_dependencies import td_common_util
from galaxy.model.orm import and_
from galaxy.web import url_for
from galaxy.util import asbool
@@ -106,6 +107,92 @@
text = common_util.tool_shed_get( app, tool_shed_url, url )
return text
+def handle_complex_repository_dependency_for_package( app, elem, package_name, package_version, tool_shed_repository ):
+ tool_dependency = None
+ tool_shed = elem.attrib[ 'toolshed' ]
+ required_repository_name = elem.attrib[ 'name' ]
+ required_repository_owner = elem.attrib[ 'owner' ]
+ default_required_repository_changeset_revision = elem.attrib[ 'changeset_revision' ]
+ required_repository = get_tool_shed_repository_by_tool_shed_name_owner_changeset_revision( app,
+ tool_shed,
+ required_repository_name,
+ required_repository_owner,
+ default_required_repository_changeset_revision )
+ tmp_filename = None
+ if required_repository:
+ required_repository_changeset_revision = required_repository.installed_changeset_revision
+ # Define the installation directory for the required tool dependency package in the required repository.
+ required_repository_package_install_dir = \
+ get_tool_dependency_install_dir( app=app,
+ repository_name=required_repository_name,
+ repository_owner=required_repository_owner,
+ repository_changeset_revision=required_repository_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
+ # Define the this dependent repository's tool dependency installation directory that will contain the env.sh file with a path to the
+ # required repository's installed tool dependency package.
+ dependent_install_dir = get_tool_dependency_install_dir( app=app,
+ repository_name=tool_shed_repository.name,
+ repository_owner=tool_shed_repository.owner,
+ repository_changeset_revision=tool_shed_repository.installed_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
+ # Set this dependent repository's tool dependency env.sh file with a path to the required repository's installed tool dependency package.
+ # We can get everything we need from the discovered installed required_repository.
+ if required_repository.status in [ app.model.ToolShedRepository.installation_status.DEACTIVATED,
+ app.model.ToolShedRepository.installation_status.INSTALLED ]:
+ if not os.path.exists( required_repository_package_install_dir ):
+ print 'Missing required tool dependency directory %s' % str( required_repository_package_install_dir )
+ repo_files_dir = required_repository.repo_files_directory( app )
+ tool_dependencies_config = get_absolute_path_to_file_in_repository( repo_files_dir, 'tool_dependencies.xml' )
+ if tool_dependencies_config:
+ config_to_use = tool_dependencies_config
+ else:
+ message = "Unable to locate required tool_dependencies.xml file for revision %s of installed repository %s owned by %s." % \
+ ( str( required_repository.changeset_revision ), str( required_repository.name ), str( required_repository.owner ) )
+ raise Exception( message )
+ else:
+ # Make a call to the tool shed to get the changeset revision to which the current value of required_repository_changeset_revision
+ # should be updated if it's not current.
+ text = get_updated_changeset_revisions_from_tool_shed( app=app,
+ tool_shed_url=tool_shed,
+ name=required_repository_name,
+ owner=required_repository_owner,
+ changeset_revision=required_repository_changeset_revision )
+ if text:
+ updated_changeset_revisions = listify( text )
+ # The list of changeset revisions is in reverse order, so the newest will be first.
+ required_repository_changeset_revision = updated_changeset_revisions[ 0 ]
+ # Make a call to the tool shed to get the required repository's tool_dependencies.xml file.
+ tmp_filename = create_temporary_tool_dependencies_config( app,
+ tool_shed,
+ required_repository_name,
+ required_repository_owner,
+ required_repository_changeset_revision )
+ config_to_use = tmp_filename
+ tool_dependency, actions_dict = populate_actions_dict( app=app,
+ dependent_install_dir=dependent_install_dir,
+ required_install_dir=required_repository_package_install_dir,
+ tool_shed_repository=tool_shed_repository,
+ required_repository=required_repository,
+ package_name=package_name,
+ package_version=package_version,
+ tool_dependencies_config=config_to_use )
+ if tmp_filename:
+ try:
+ os.remove( tmp_filename )
+ except:
+ pass
+ # Install and build the package via fabric.
+ install_and_build_package_via_fabric( app, tool_dependency, actions_dict )
+ else:
+ message = "Unable to locate required tool shed repository named %s owned by %s with revision %s." % \
+ ( str( required_repository_name ), str( required_repository_owner ), str( default_required_repository_changeset_revision ) )
+ raise Exception( message )
+ return tool_dependency
+
def handle_set_environment_entry_for_package( app, install_dir, tool_shed_repository, package_name, package_version, elem, required_repository ):
"""
Populate a list of actions for creating an env.sh file for a dependent repository. The received elem is the <package> tag set associated
@@ -222,88 +309,9 @@
for package_elem in elem:
if package_elem.tag == 'repository':
# We have a complex repository dependency definition.
- tool_shed = package_elem.attrib[ 'toolshed' ]
- required_repository_name = package_elem.attrib[ 'name' ]
- required_repository_owner = package_elem.attrib[ 'owner' ]
- default_required_repository_changeset_revision = package_elem.attrib[ 'changeset_revision' ]
- required_repository = get_tool_shed_repository_by_tool_shed_name_owner_changeset_revision( app,
- tool_shed,
- required_repository_name,
- required_repository_owner,
- default_required_repository_changeset_revision )
- tmp_filename = None
- if required_repository:
- required_repository_changeset_revision = required_repository.installed_changeset_revision
- # Define the installation directory for the required tool dependency package in the required repository.
- required_repository_package_install_dir = \
- get_tool_dependency_install_dir( app=app,
- repository_name=required_repository_name,
- repository_owner=required_repository_owner,
- repository_changeset_revision=required_repository_changeset_revision,
- tool_dependency_type='package',
- tool_dependency_name=package_name,
- tool_dependency_version=package_version )
- # Define the this dependent repository's tool dependency installation directory that will contain the env.sh file with a path to the
- # required repository's installed tool dependency package.
- dependent_install_dir = get_tool_dependency_install_dir( app=app,
- repository_name=tool_shed_repository.name,
- repository_owner=tool_shed_repository.owner,
- repository_changeset_revision=tool_shed_repository.installed_changeset_revision,
- tool_dependency_type='package',
- tool_dependency_name=package_name,
- tool_dependency_version=package_version )
- # Set this dependent repository's tool dependency env.sh file with a path to the required repository's installed tool dependency package.
- # We can get everything we need from the discovered installed required_repository.
- if required_repository.status in [ app.model.ToolShedRepository.installation_status.DEACTIVATED,
- app.model.ToolShedRepository.installation_status.INSTALLED ]:
- if not os.path.exists( required_repository_package_install_dir ):
- print 'Missing required tool dependency directory %s' % str( required_repository_package_install_dir )
- repo_files_dir = required_repository.repo_files_directory( app )
- tool_dependencies_config = get_absolute_path_to_file_in_repository( repo_files_dir, 'tool_dependencies.xml' )
- if tool_dependencies_config:
- config_to_use = tool_dependencies_config
- else:
- message = "Unable to locate required tool_dependencies.xml file for revision %s of installed repository %s owned by %s." % \
- ( str( required_repository.changeset_revision ), str( required_repository.name ), str( required_repository.owner ) )
- raise Exception( message )
- else:
- # Make a call to the tool shed to get the changeset revision to which the current value of required_repository_changeset_revision
- # should be updated if it's not current.
- text = get_updated_changeset_revisions_from_tool_shed( app=app,
- tool_shed_url=tool_shed,
- name=required_repository_name,
- owner=required_repository_owner,
- changeset_revision=required_repository_changeset_revision )
- if text:
- updated_changeset_revisions = listify( text )
- # The list of changeset revisions is in reverse order, so the newest will be first.
- required_repository_changeset_revision = updated_changeset_revisions[ 0 ]
- # Make a call to the tool shed to get the required repository's tool_dependencies.xml file.
- tmp_filename = create_temporary_tool_dependencies_config( app,
- tool_shed,
- required_repository_name,
- required_repository_owner,
- required_repository_changeset_revision )
- config_to_use = tmp_filename
- tool_dependency, actions_dict = populate_actions_dict( app=app,
- dependent_install_dir=dependent_install_dir,
- required_install_dir=required_repository_package_install_dir,
- tool_shed_repository=tool_shed_repository,
- required_repository=required_repository,
- package_name=package_name,
- package_version=package_version,
- tool_dependencies_config=config_to_use )
- if tmp_filename:
- try:
- os.remove( tmp_filename )
- except:
- pass
- # Install and build the package via fabric.
- install_and_build_package_via_fabric( app, tool_dependency, actions_dict )
- else:
- message = "Unable to locate required tool shed repository named %s owned by %s with revision %s." % \
- ( str( required_repository_name ), str( required_repository_owner ), str( default_required_repository_changeset_revision ) )
- raise Exception( message )
+ rd_tool_dependency = handle_complex_repository_dependency_for_package( app, package_elem, package_name, package_version, tool_shed_repository )
+ if rd_tool_dependency and rd_tool_dependency.status == app.model.ToolDependency.installation_status.ERROR:
+ print "Error installing tool dependency for required repository: %s" % str( rd_tool_dependency.error_message )
elif package_elem.tag == 'install':
# <install version="1.0">
# Get the installation directory for tool dependencies that will be installed for the received tool_shed_repository.
@@ -333,114 +341,32 @@
type='package',
status=app.model.ToolDependency.installation_status.INSTALLING,
set_status=True )
- # Get the information that defines the current platform.
+ # Get the information about the current platform in case the tool dependency definition includes tag sets for installing
+ # compiled binaries.
platform_info_dict = tool_dependency_util.get_platform_info_dict()
if package_install_version == '1.0':
- # Handle tool dependency installation using a fabric method included in the Galaxy framework. The first thing we do
- # is check the installation architecture to see if we have a precompiled binary that works on the target system.
+ # Handle tool dependency installation using a fabric method included in the Galaxy framework.
+ actions_elem_tuples = td_common_util.parse_package_elem( package_elem,
+ platform_info_dict=platform_info_dict,
+ include_after_install_actions=True )
+                # At this point we have a list of <actions> elems that are either defined within an <actions_group> tag set (whose <actions>
+                # sub-elements contain os and architecture attributes and have been filtered by the platform into which the appropriate
+                # compiled binary will be installed) or defined outside an <actions_group> tag set and therefore not filtered.
binary_installed = False
- actions_elem_tuples = []
- # Build a list of grouped and ungrouped <actions> tagsets to be processed in the order they are defined in the
- # tool_dependencies.xml file.
- for elem in package_elem:
- # Default to not treating actions as grouped.
- grouped = False
- # Skip any element that is not <actions> or <actions_group>. This will also skip comments and <readme> tags.
- if elem.tag not in [ 'actions', 'actions_group' ]:
- continue
- if elem.tag == 'actions':
- # We have an <actions> tag that should not be matched against a specific combination of architecture and operating system.
- grouped = False
- actions_elem_tuples.append( ( grouped, elem ) )
- else:
- # Record the number of <actions> elements, in order to filter out any <action> elements that precede <actions>
- # elements.
- actions_elem_count = len( elem.findall( 'actions' ) )
- # Record the number of <actions> elements that have architecture and os specified, in order to filter out any
- # platform-independent <actions> elements that come before platform-specific <actions> elements. This call to
- # elem.findall is filtered by tags that have both the os and architecture specified.
- # For more details, see http://docs.python.org/2/library/xml.etree.elementtree.html Section 19.7.2.1.
- platform_actions_element_count = len( elem.findall( 'actions[@architecture][@os]' ) )
- platform_actions_elements_processed = 0
- actions_elems_processed = 0
- # We have an actions_group element, and its child <actions> elements should therefore be compared with the current
- # operating system and processor architecture.
- grouped = True
- # The tagsets that will go into the actions_elem_list are those that install a precompiled binary if the
- # architecture and operating system match its defined attributes. If precompiled binary is not installed
- # the first <actions> tag following those that have the os and architecture attributes will be processed
- # in order to install and compile the source.
- actions_elem_list = []
- # The tagsets that will go into the after_install_actions list are <action> tags instead of <actions> tags. These
- # will only be processed if they are at the end of the <actions_group> tagset. See below for details.
- after_install_actions = []
- platform_independent_actions = []
- # Loop through the <actions_group> element and build the actions_elem_list and the after_install_actions list.
- for child_element in elem:
- if child_element.tag == 'actions':
- actions_elems_processed += 1
- system = child_element.get( 'os' )
- architecture = child_element.get( 'architecture' )
- # Skip <actions> tags that have only one of architecture or os specified, in order for the count in
- # platform_actions_elements_processed to remain accurate.
- if ( system and not architecture ) or ( architecture and not system ):
- log.debug( 'Error: Both architecture and os attributes must be specified in an <actions> tag.' )
- continue
- # Since we are inside an <actions_group> tagset, compare it with our current platform information and filter
- # the <actions> tagsets that don't match. Require both the os and architecture attributes to be defined in
- # order to find a match.
- if system and architecture:
- platform_actions_elements_processed += 1
- # If either the os or architecture do not match the platform, this <actions> tag will not be considered
- # a match. Skip it and proceed with checking the next one.
- if platform_info_dict[ 'os' ] != system or platform_info_dict[ 'architecture' ] != architecture:
- continue
- else:
- # <actions> tags without both os and architecture attributes are only allowed to be specified after
- # platform-specific <actions> tags. If we find a platform-independent <actions> tag before all
- # platform-specific <actions> tags have been processed, log a message stating this and skip to the
- # next <actions> tag.
- if platform_actions_elements_processed < platform_actions_element_count:
- message = 'Error: <actions> tags without os and architecture attributes are only allowed '
- message += 'after <actions> tags with os and architecture attributes specified. Skipping '
- message += 'current <actions> tag.'
- log.debug( message )
- continue
- # If we reach this point, it means one of two things: 1) The system and architecture attributes are not
- # defined in this <actions> tag, or 2) The system and architecture attributes are defined, and they are
- # an exact match for the current platform. Append the child element to the list of elements to process.
- actions_elem_list.append( child_element )
- elif child_element.tag == 'action':
- # Any <action> tags within an <actions_group> tagset must come after all <actions> tags.
- if actions_elems_processed == actions_elem_count:
- # If all <actions> elements have been processed, then this <action> element can be appended to the
- # list of actions to execute within this group.
- after_install_actions.append( child_element )
- else:
- # If any <actions> elements remain to be processed, then log a message stating that <action>
- # elements are not allowed to precede any <actions> elements within an <actions_group> tagset.
- message = 'Error: <action> tags are only allowed at the end of an <actions_group> '
- message += 'tagset, after all <actions> tags. '
- message += 'Skipping <%s> element with type %s.' % ( child_element.tag, child_element.get( 'type' ) )
- log.debug( message )
- continue
- if after_install_actions:
- actions_elem_list.extend( after_install_actions )
- actions_elem_tuples.append( ( grouped, actions_elem_list ) )
- # At this point we have a list of <actions> elems that are either defined within an <actions_group> tagset, and filtered by
- # the current platform, or not defined within an <actions_group> tagset, and not filtered.
- for grouped, actions_elems in actions_elem_tuples:
- if grouped:
- # Platform matching is only performed inside <actions_group> tagsets, os and architecture attributes are otherwise ignored.
+ for in_actions_group, actions_elems in actions_elem_tuples:
+ if in_actions_group:
+ # Platform matching is only performed inside <actions_group> tag sets, os and architecture attributes are otherwise
+ # ignored.
for actions_elem in actions_elems:
system = actions_elem.get( 'os' )
architecture = actions_elem.get( 'architecture' )
- # If this <actions> element has the os and architecture attributes defined, then we only want to process
- # until a successful installation is achieved.
+ # If this <actions> element has the os and architecture attributes defined, then we only want to process until a
+ # successful installation is achieved.
if system and architecture:
- # If an <actions> tag has been defined that matches our current platform, and the recipe specified
- # within that <actions> tag has been successfully processed, skip any remaining platform-specific
- # <actions> tags.
+ # If an <actions> tag has been defined that matches our current platform, and the recipe specified within
+ # that <actions> tag has been successfully processed, skip any remaining platform-specific <actions> tags.
+                            # We cannot break out of the loop here because there may be <action> tags at the end of the <actions_group>
+ # tag set that must be processed.
if binary_installed:
continue
# No platform-specific <actions> recipe has yet resulted in a successful installation.
@@ -451,33 +377,32 @@
actions_elem=actions_elem,
action_elem=None )
sa_session.refresh( tool_dependency )
- if tool_dependency.status != app.model.ToolDependency.installation_status.ERROR:
+ if tool_dependency.status == app.model.ToolDependency.installation_status.INSTALLED:
# If an <actions> tag was found that matches the current platform, and the install_via_fabric method
# did not result in an error state, set binary_installed to True in order to skip any remaining
# platform-specific <actions> tags.
- if not binary_installed:
- binary_installed = True
+ binary_installed = True
else:
- # Otherwise, move on to the next matching <actions> tag, or any defined <actions> tags that do not
- # contain platform-dependent recipes.
- if binary_installed:
- binary_installed = False
- print 'Encountered an error downloading binary for %s version %s: %s' % \
- ( package_name, package_version, tool_dependency.error_message )
+ # Process the next matching <actions> tag, or any defined <actions> tags that do not contain platform
+ # dependent recipes.
+ print 'Error downloading binary for %s version %s: %s' % \
+ ( package_name, package_version, tool_dependency.error_message )
else:
# If no <actions> tags have been defined that match our current platform, or none of the matching
# <actions> tags resulted in a successful tool dependency status, proceed with one and only one
# <actions> tag that is not defined to be platform-specific.
if not binary_installed:
- log.debug( 'Platform-specific recipe failed or not found. Proceeding with platform-independent install recipe.' )
+ print 'Binary installation did not occur, so proceeding with install and compile recipe.'
+                            # Make sure to reset for installation if the attempt at binary installation resulted in an error.
+ if tool_dependency.status != app.model.ToolDependency.installation_status.NEW:
+ removed, error_message = tool_dependency_util.remove_tool_dependency( app, tool_dependency )
install_via_fabric( app,
tool_dependency,
install_dir,
package_name=package_name,
actions_elem=actions_elem,
action_elem=None )
- break
- # Perform any final actions that have been defined within the actions_group tagset, but outside of
+ # Perform any final actions that have been defined within the actions_group tag set, but outside of
# an <actions> tag, such as a set_environment entry, or a download_file or download_by_url command to
# retrieve extra data for this tool dependency. Only do this if the tool dependency is not in an error
# state, otherwise skip this action.
@@ -490,7 +415,7 @@
action_elem=actions_elem )
else:
# <actions> tags outside of an <actions_group> tag shall not check os or architecture, and if the attributes are
- # defined, they will be ignored. All <actions> tags outside of an <actions_group> tagset shall always be processed.
+ # defined, they will be ignored. All <actions> tags outside of an <actions_group> tag set shall always be processed.
# This is the default and original behavior of the install_package method.
install_via_fabric( app,
tool_dependency,
@@ -519,12 +444,12 @@
def install_via_fabric( app, tool_dependency, install_dir, package_name=None, proprietary_fabfile_path=None, actions_elem=None, action_elem=None, **kwd ):
"""Parse a tool_dependency.xml file's <actions> tag set to gather information for the installation via fabric."""
- sa_session = app.model.context.current
def evaluate_template( text ):
""" Substitute variables defined in XML blocks from dependencies file."""
return Template( text ).safe_substitute( td_common_util.get_env_var_values( install_dir ) )
+ sa_session = app.model.context.current
if not os.path.exists( install_dir ):
os.makedirs( install_dir )
actions_dict = dict( install_dir=install_dir )
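The `evaluate_template` helper shown in this hunk substitutes `$VARIABLE` references in action text via `string.Template.safe_substitute`. A minimal standalone sketch of that idea follows; note that `get_env_var_values` is reduced here to a plain dict for illustration (an assumption — in Galaxy, `td_common_util.get_env_var_values( install_dir )` derives the mapping from the installation directory's env.sh context):

```python
from string import Template

def get_env_var_values(install_dir):
    # Hypothetical stand-in for td_common_util.get_env_var_values: in Galaxy the
    # mapping is built from the install directory; here we assume a plain dict.
    return {'INSTALL_DIR': install_dir}

def evaluate_template(text, install_dir):
    # safe_substitute leaves unknown $VARIABLES untouched instead of raising
    # KeyError, so partially-templated action text passes through unharmed.
    return Template(text).safe_substitute(get_env_var_values(install_dir))

print(evaluate_template('$INSTALL_DIR/bin', '/opt/td'))  # /opt/td/bin
print(evaluate_template('$UNKNOWN stays', '/opt/td'))    # $UNKNOWN stays
```

Using `safe_substitute` rather than `substitute` is the design choice that matters: a tool_dependencies.xml recipe may reference variables that are only defined later in the installation, and those must survive substitution unmodified.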
@@ -535,7 +460,7 @@
env_var_dicts = []
if actions_elem is not None:
elems = actions_elem
- if elems.get( 'architecture' ) is not None:
+ if elems.get( 'os' ) is not None and elems.get( 'architecture' ) is not None:
is_binary_download = True
else:
is_binary_download = False
@@ -750,7 +675,6 @@
new_value = new_value.split( ';' )[ 0 ]
return new_value
-
def populate_actions_dict( app, dependent_install_dir, required_install_dir, tool_shed_repository, required_repository, package_name, package_version, tool_dependencies_config ):
"""
Populate an actions dictionary that can be sent to fabric_util.install_and_build_package. This method handles the scenario where a tool_dependencies.xml
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
@@ -239,6 +239,105 @@
os.makedirs( destination_directory )
shutil.move( source_file, destination_directory )
+def parse_package_elem( package_elem, platform_info_dict=None, include_after_install_actions=True ):
+ """
+ Parse a <package> element within a tool dependency definition and return a list of action tuples. This method is called when setting
+    metadata on, or installing, a repository that includes a tool_dependencies.xml file. If installing, platform_info_dict must be a valid
+    dictionary and include_after_install_actions must be True.
+ """
+ # The actions_elem_tuples list contains <actions> tag sets (possibly inside of an <actions_group> tag set) to be processed in the order
+ # they are defined in the tool_dependencies.xml file.
+ actions_elem_tuples = []
+ # The tag sets that will go into the actions_elem_list are those that install a compiled binary if the architecture and operating system
+    match its defined attributes. If a compiled binary is not installed, the first <actions> tag set [following those that have the os and
+ # architecture attributes] that does not have os or architecture attributes will be processed. This tag set must contain the recipe for
+ # downloading and compiling source.
+ actions_elem_list = []
+ for elem in package_elem:
+ if elem.tag == 'actions':
+ # We have an <actions> tag that should not be matched against a specific combination of architecture and operating system.
+ in_actions_group = False
+ actions_elem_tuples.append( ( in_actions_group, elem ) )
+ elif elem.tag == 'actions_group':
+ # We have an actions_group element, and its child <actions> elements should therefore be compared with the current operating system
+ # and processor architecture.
+ in_actions_group = True
+ # Record the number of <actions> elements so we can filter out any <action> elements that precede <actions> elements.
+ actions_elem_count = len( elem.findall( 'actions' ) )
+ # Record the number of <actions> elements that have architecture and os specified, in order to filter out any platform-independent
+ # <actions> elements that come before platform-specific <actions> elements. This call to elem.findall is filtered by tags that have
+ # both the os and architecture specified. For more details, see http://docs.python.org/2/library/xml.etree.elementtree.html Section
+ # 19.7.2.1.
+ platform_actions_element_count = len( elem.findall( 'actions[@architecture][@os]' ) )
+ platform_actions_elements_processed = 0
+ actions_elems_processed = 0
+ # The tag sets that will go into the after_install_actions list are <action> tags instead of <actions> tags. These will be processed
+ # only if they are at the very end of the <actions_group> tag set (after all <actions> tag sets). See below for details.
+ after_install_actions = []
+ # Inspect the <actions_group> element and build the actions_elem_list and the after_install_actions list.
+ for child_element in elem:
+ if child_element.tag == 'actions':
+ actions_elems_processed += 1
+ system = child_element.get( 'os' )
+ architecture = child_element.get( 'architecture' )
+ # Skip <actions> tags that have only one of architecture or os specified, in order for the count in
+ # platform_actions_elements_processed to remain accurate.
+ if ( system and not architecture ) or ( architecture and not system ):
+ log.debug( 'Error: Both architecture and os attributes must be specified in an <actions> tag.' )
+ continue
+ # Since we are inside an <actions_group> tag set, compare it with our current platform information and filter the <actions>
+ # tag sets that don't match. Require both the os and architecture attributes to be defined in order to find a match.
+ if system and architecture:
+ platform_actions_elements_processed += 1
+ # If either the os or architecture do not match the platform, this <actions> tag will not be considered a match. Skip
+ # it and proceed with checking the next one.
+ if platform_info_dict:
+ if platform_info_dict[ 'os' ] != system or platform_info_dict[ 'architecture' ] != architecture:
+ continue
+ else:
+ # We must not be installing a repository into Galaxy, so determining if we can install a binary is not necessary.
+ continue
+ else:
+ # <actions> tags without both os and architecture attributes are only allowed to be specified after platform-specific
+ # <actions> tags. If we find a platform-independent <actions> tag before all platform-specific <actions> tags have been
+                        # processed, log a message and skip it.
+ if platform_actions_elements_processed < platform_actions_element_count:
+ message = 'Error: <actions> tags without os and architecture attributes are only allowed after all <actions> tags with '
+                        message += 'os and architecture attributes have been defined. Skipping this platform-independent <actions> tag set.'
+ log.debug( message )
+ continue
+ # If we reach this point, it means one of two things: 1) The system and architecture attributes are not defined in this
+ # <actions> tag, or 2) The system and architecture attributes are defined, and they are an exact match for the current
+ # platform. Append the child element to the list of elements to process.
+ actions_elem_list.append( child_element )
+ elif child_element.tag == 'action':
+ # Any <action> tags within an <actions_group> tag set must come after all <actions> tags.
+ if actions_elems_processed == actions_elem_count:
+ # If all <actions> elements have been processed, then this <action> element can be appended to the list of actions to
+ # execute within this group.
+ after_install_actions.append( child_element )
+ else:
+ # If any <actions> elements remain to be processed, then log a message stating that <action> elements are not allowed
+ # to precede any <actions> elements within an <actions_group> tag set.
+ message = 'Error: <action> tags are only allowed at the end of an <actions_group> tag set after all <actions> tags. '
+ message += 'Skipping <%s> element with type %s.' % ( child_element.tag, child_element.get( 'type' ) )
+ log.debug( message )
+ continue
+ if platform_info_dict is None and not include_after_install_actions:
+ # We must be setting metadata on a repository.
+ actions_elem_tuples.append( ( in_actions_group, actions_elem_list[ 0 ] ) )
+ elif platform_info_dict is not None and include_after_install_actions:
+ # We must be installing a repository.
+ if after_install_actions:
+ actions_elem_list.extend( after_install_actions )
+ actions_elem_tuples.append( ( in_actions_group, actions_elem_list ) )
+ else:
+ # Skip any element that is not <actions> or <actions_group> - this will skip comments, <repository> tags and <readme> tags.
+ in_actions_group = False
+ continue
+ return actions_elem_tuples
+
def tar_extraction_directory( file_path, file_name ):
"""Try to return the correct extraction directory."""
file_name = file_name.strip()
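Before the next file, an illustrative sketch (not Galaxy code) of the filtering that `parse_package_elem` performs on an `<actions_group>` tag set: platform-specific `<actions>` tag sets are matched against the current os/architecture, a platform-independent `<actions>` tag set serves as the compile-from-source fallback, and trailing `<action>` tags are collected to run after a successful install. The package name and URLs below are hypothetical:

```python
import xml.etree.ElementTree as ET

PACKAGE_XML = """
<package name="samtools" version="0.1.19">
  <actions_group>
    <actions os="linux" architecture="x86_64">
      <action type="download_binary">http://example.org/linux64.tgz</action>
    </actions>
    <actions os="darwin" architecture="x86_64">
      <action type="download_binary">http://example.org/osx64.tgz</action>
    </actions>
    <actions>
      <action type="shell_command">make</action>
    </actions>
    <action type="set_environment">
      <environment_variable action="prepend_to" name="PATH">$INSTALL_DIR/bin</environment_variable>
    </action>
  </actions_group>
</package>
"""

def filter_actions_group(group_elem, platform):
    """Return (selected <actions> elems, trailing <action> elems, platform-specific count)."""
    # Mirrors elem.findall( 'actions[@architecture][@os]' ) in the diff: the
    # limited XPath in ElementTree supports chained attribute predicates.
    platform_specific_count = len(group_elem.findall('actions[@architecture][@os]'))
    selected = []
    for child in group_elem.findall('actions'):
        system = child.get('os')
        architecture = child.get('architecture')
        if system and architecture:
            # Keep only the platform-specific tag set matching the current platform.
            if (system, architecture) == platform:
                selected.append(child)
        else:
            # Platform-independent fallback recipe (download and compile source).
            selected.append(child)
    # Direct <action> children run after installation, regardless of platform.
    after_install = group_elem.findall('action')
    return selected, after_install, platform_specific_count

package = ET.fromstring(PACKAGE_XML)
group = package.find('actions_group')
selected, after_install, specific_count = filter_actions_group(group, ('linux', 'x86_64'))
```

On a linux/x86_64 platform this selects two `<actions>` tag sets (the matching binary recipe plus the source fallback) and one trailing `<action>`, which is why the installation loop above cannot simply `break` after a binary install succeeds.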
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -133,6 +133,24 @@
bzipped_file.close()
shutil.move( uncompressed, uploaded_file_name )
+def handle_complex_repository_dependency_elem( trans, elem, sub_elem_index, sub_elem, sub_elem_altered, altered, unpopulate=False ):
+ """
+ Populate or unpopulate the toolshed and changeset_revision attributes of a <repository> tag that defines a complex repository
+ dependency.
+ """
+ # The received sub_elem looks something like the following:
+ # <repository name="package_eigen_2_0" owner="test" prior_installation_required="True" />
+ revised, repository_elem, error_message = handle_repository_dependency_elem( trans, sub_elem, unpopulate=unpopulate )
+ if error_message:
+ exception_message = 'The tool_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
+ raise Exception( exception_message )
+ if revised:
+ elem[ sub_elem_index ] = repository_elem
+ sub_elem_altered = True
+ if not altered:
+ altered = True
+ return altered, sub_elem_altered, elem
+
def handle_directory_changes( trans, repository, full_path, filenames_in_archive, remove_repo_files_not_in_tar, new_repo_alert, commit_message,
undesirable_dirs_removed, undesirable_files_removed ):
repo_dir = repository.repo_path( trans.app )
@@ -300,7 +318,30 @@
error_message = 'Unable to locate repository with name %s and owner %s. ' % ( str( name ), str( owner ) )
return revised, elem, error_message
+def handle_set_environment_for_install( trans, package_altered, altered, actions_elem, action_index, action_elem, unpopulate=False ):
+ # <action type="set_environment_for_install">
+ # <repository name="package_eigen_2_0" owner="test" changeset_revision="09eb05087cd0">
+ # <package name="eigen" version="2.0.17" />
+ # </repository>
+ # </action>
+ for repo_index, repo_elem in enumerate( action_elem ):
+ revised, repository_elem, error_message = handle_repository_dependency_elem( trans, repo_elem, unpopulate=unpopulate )
+ if error_message:
+ exception_message = 'The tool_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
+ raise Exception( exception_message )
+ if revised:
+ action_elem[ repo_index ] = repository_elem
+ package_altered = True
+ if not altered:
+ altered = True
+ if package_altered:
+ actions_elem[ action_index ] = action_elem
+ return package_altered, altered, actions_elem
+
def handle_tool_dependencies_definition( trans, tool_dependencies_config, unpopulate=False ):
+ """
+    Populate or unpopulate the toolshed and changeset_revision attributes of each <repository> tag defined within a tool_dependencies.xml file.
+ """
altered = False
# Make sure we're looking at a valid tool_dependencies.xml file.
tree, error_message = xml_util.parse_xml( tool_dependencies_config )
@@ -308,51 +349,88 @@
return False, None
root = tree.getroot()
if root.tag == 'tool_dependency':
+ package_altered = False
for root_index, root_elem in enumerate( root ):
# <package name="eigen" version="2.0.17">
+ package_altered = False
if root_elem.tag == 'package':
- package_altered = False
for package_index, package_elem in enumerate( root_elem ):
if package_elem.tag == 'repository':
- # <repository name="package_eigen_2_0" owner="test" changeset_revision="09eb05087cd0" prior_installation_required="True" />
- revised, repository_elem, error_message = handle_repository_dependency_elem( trans, package_elem, unpopulate=unpopulate )
- if error_message:
- exception_message = 'The tool_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
- raise Exception( exception_message )
- if revised:
- root_elem[ package_index ] = repository_elem
- package_altered = True
- if not altered:
- altered = True
+ # We have a complex repository dependency.
+ altered, package_altered, root_elem = handle_complex_repository_dependency_elem( trans,
+ root_elem,
+ package_index,
+ package_elem,
+ package_altered,
+ altered,
+ unpopulate=unpopulate )
elif package_elem.tag == 'install':
# <install version="1.0">
for actions_index, actions_elem in enumerate( package_elem ):
- for action_index, action_elem in enumerate( actions_elem ):
- action_type = action_elem.get( 'type' )
- if action_type == 'set_environment_for_install':
- # <action type="set_environment_for_install">
- # <repository name="package_eigen_2_0" owner="test" changeset_revision="09eb05087cd0">
- # <package name="eigen" version="2.0.17" />
- # </repository>
- # </action>
- for repo_index, repo_elem in enumerate( action_elem ):
- revised, repository_elem, error_message = handle_repository_dependency_elem( trans, repo_elem, unpopulate=unpopulate )
- if error_message:
- exception_message = 'The tool_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
- raise Exception( exception_message )
- if revised:
- action_elem[ repo_index ] = repository_elem
- package_altered = True
- if not altered:
- altered = True
- if package_altered:
- actions_elem[ action_index ] = action_elem
+ if actions_elem.tag == 'actions_group':
+ # Inspect all entries in the <actions_group> tag set, skipping <actions> tag sets that define os and architecture
+ # attributes. We want to inspect only the last <actions> tag set contained within the <actions_group> tag set to
+ # see if a complex repository dependency is defined.
+ for actions_group_index, actions_group_elem in enumerate( actions_elem ):
+ if actions_group_elem.tag == 'actions':
+ # Skip all actions tags that include os or architecture attributes.
+ system = actions_group_elem.get( 'os' )
+ architecture = actions_group_elem.get( 'architecture' )
+ if system or architecture:
+ continue
+ # ...
+ # <actions>
+ # <package name="libgtextutils" version="0.6">
+ # <repository name="package_libgtextutils_0_6" owner="test" prior_installation_required="True" />
+ # </package>
+ # ...
+ for last_actions_index, last_actions_elem in enumerate( actions_group_elem ):
+ last_actions_package_altered = False
+ if last_actions_elem.tag == 'package':
+ for last_actions_elem_package_index, last_actions_elem_package_elem in enumerate( last_actions_elem ):
+ if last_actions_elem_package_elem.tag == 'repository':
+ # We have a complex repository dependency.
+ altered, last_actions_package_altered, last_actions_elem = \
+ handle_complex_repository_dependency_elem( trans,
+ last_actions_elem,
+ last_actions_elem_package_index,
+ last_actions_elem_package_elem,
+ last_actions_package_altered,
+ altered,
+ unpopulate=unpopulate )
+ if last_actions_package_altered:
+ last_actions_elem[ last_actions_elem_package_index ] = last_actions_elem_package_elem
+ actions_group_elem[ last_actions_index ] = last_actions_elem
+ else:
+ last_actions_elem_action_type = last_actions_elem.get( 'type' )
+ if last_actions_elem_action_type == 'set_environment_for_install':
+ last_actions_package_altered, altered, last_actions_elem = \
+ handle_set_environment_for_install( trans,
+ last_actions_package_altered,
+ altered,
+ actions_group_elem,
+ last_actions_index,
+ last_actions_elem,
+ unpopulate=unpopulate )
+ elif actions_elem.tag == 'actions':
+ # We are not in an <actions_group> tag set, so we must be in an <actions> tag set.
+ for action_index, action_elem in enumerate( actions_elem ):
+ action_type = action_elem.get( 'type' )
+ if action_type == 'set_environment_for_install':
+ package_altered, altered, actions_elem = handle_set_environment_for_install( trans,
+ package_altered,
+ altered,
+ actions_elem,
+ action_index,
+ action_elem,
+ unpopulate=unpopulate )
if package_altered:
package_elem[ actions_index ] = actions_elem
- if package_altered:
- root_elem[ package_index ] = package_elem
- if package_altered:
- root[ root_index ] = root_elem
+ if package_altered:
+ root_elem[ package_index ] = package_elem
+ if package_altered:
+ root[ root_index ] = root_elem
return altered, root
return False, None
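The traversal added in this commit looks for a "complex repository dependency": a `<repository>` tag nested inside a `<package>` tag within an `<actions>` (or `<actions_group>`) tag set. A minimal, hypothetical sketch of that detection — using stand-in element names from the diff's own comments, not Galaxy's actual helpers — could look like:

```python
# Hypothetical sketch (not Galaxy's actual API) of detecting a complex
# repository dependency: a <repository> nested in a <package> within
# an <actions> tag set of a tool_dependencies.xml <install> element.
import xml.etree.ElementTree as ET

SAMPLE = """
<install version="1.0">
  <actions>
    <package name="libgtextutils" version="0.6">
      <repository name="package_libgtextutils_0_6" owner="test"
                  prior_installation_required="True" />
    </package>
  </actions>
</install>
"""

def find_complex_repository_dependencies(install_elem):
    """Yield (package_name, package_version, repository_elem) for every
    <repository> nested inside a <package> under an <actions> tag set."""
    for actions_elem in install_elem.findall('actions'):
        for package_elem in actions_elem.findall('package'):
            name = package_elem.get('name')
            version = package_elem.get('version')
            # Both attributes must be present, mirroring the checks in
            # the diff above (ae_package_name and ae_package_version).
            if name and version:
                for repository_elem in package_elem.findall('repository'):
                    yield name, version, repository_elem

root = ET.fromstring(SAMPLE)
deps = list(find_complex_repository_dependencies(root))
```

The real code additionally tracks `altered` flags and writes modified elements back into their parents by index, since ElementTree children are replaced in place rather than returned.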
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/util/common_install_util.py
--- a/lib/tool_shed/util/common_install_util.py
+++ b/lib/tool_shed/util/common_install_util.py
@@ -242,9 +242,8 @@
def get_installed_and_missing_tool_dependencies_for_new_install( trans, all_tool_dependencies ):
"""Return the lists of installed tool dependencies and missing tool dependencies for a set of repositories being installed into Galaxy."""
- # FIXME: this method currently populates and returns only missing tool dependencies since tool dependencies defined for complex repository dependency
- # relationships is not currently supported. This method should be enhanced to search for installed tool dependencies defined as complex repository
- # dependency relationships when that feature is implemented.
+ # FIXME: confirm that this method currently populates and returns only missing tool dependencies. If so, this method should be enhanced to search for
+ # installed tool dependencies defined as complex repository dependency relationships.
if all_tool_dependencies:
tool_dependencies = {}
missing_tool_dependencies = {}
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/util/metadata_util.py
--- a/lib/tool_shed/util/metadata_util.py
+++ b/lib/tool_shed/util/metadata_util.py
@@ -751,6 +751,32 @@
# dependency definition will be set as invalid. This is currently the only case where a tool dependency definition is
# considered invalid.
repository_dependency_tup, repository_dependency_is_valid, error_message = handle_repository_elem( app=app, repository_elem=sub_elem )
+ elif sub_elem.tag == 'install':
+ package_install_version = sub_elem.get( 'version', '1.0' )
+ if package_install_version == '1.0':
+ # Complex repository dependencies can be defined within the last <actions> tag set contained in an <actions_group> tag set.
+ # Comments, <repository> tag sets and <readme> tag sets will be skipped in td_common_util.parse_package_elem().
+ actions_elem_tuples = td_common_util.parse_package_elem( sub_elem, platform_info_dict=None, include_after_install_actions=False )
+ if actions_elem_tuples:
+ # We now have a list of a single tuple that looks something like: [(True, <Element 'actions' at 0x104017850>)]
+ actions_elem_tuple = actions_elem_tuples[ 0 ]
+ in_actions_group, actions_elem = actions_elem_tuple
+ if in_actions_group:
+ # Since we're inside an <actions_group> tag set, inspect the actions_elem to see if a complex repository dependency
+ # is defined.
+ for action_elem in actions_elem:
+ if action_elem.tag == 'package':
+ # <package name="libgtextutils" version="0.6">
+ # <repository name="package_libgtextutils_0_6" owner="test" prior_installation_required="True" />
+ # </package>
+ ae_package_name = action_elem.get( 'name', None )
+ ae_package_version = action_elem.get( 'version', None )
+ if ae_package_name and ae_package_version:
+ for sub_action_elem in action_elem:
+ if sub_action_elem.tag == 'repository':
+ # We have a complex repository dependency.
+ repository_dependency_tup, repository_dependency_is_valid, error_message = \
+ handle_repository_elem( app=app, repository_elem=sub_action_elem )
if requirements_dict:
dependency_key = '%s/%s' % ( package_name, package_version )
if repository_dependency_is_valid:
diff -r 53f51406718dbe21dc1deec10749e9dadd0ee77c -r f4a839856cfb06c01845cdd96f2893f4b70250bb lib/tool_shed/util/tool_dependency_util.py
--- a/lib/tool_shed/util/tool_dependency_util.py
+++ b/lib/tool_shed/util/tool_dependency_util.py
@@ -365,14 +365,15 @@
missing_tool_dependencies = repository_missing_tool_dependencies
return installed_tool_dependencies, missing_tool_dependencies
-def remove_tool_dependency( trans, tool_dependency ):
- dependency_install_dir = tool_dependency.installation_directory( trans.app )
+def remove_tool_dependency( app, tool_dependency ):
+ sa_session = app.model.context.current
+ dependency_install_dir = tool_dependency.installation_directory( app )
removed, error_message = remove_tool_dependency_installation_directory( dependency_install_dir )
if removed:
- tool_dependency.status = trans.model.ToolDependency.installation_status.UNINSTALLED
+ tool_dependency.status = app.model.ToolDependency.installation_status.UNINSTALLED
tool_dependency.error_message = None
- trans.sa_session.add( tool_dependency )
- trans.sa_session.flush()
+ sa_session.add( tool_dependency )
+ sa_session.flush()
return removed, error_message
def remove_tool_dependency_installation_directory( dependency_install_dir ):
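The `remove_tool_dependency` hunk above swaps the request-bound `trans` object for the application object, deriving the database session from `app.model.context.current`. A hedged sketch of that dependency flow, using hypothetical stand-in classes purely to show the shape of the refactor:

```python
# Hypothetical stand-ins (not Galaxy classes) illustrating the
# trans -> app refactor: the function now needs only the app object
# and pulls its SQLAlchemy-style session from app.model.context.current.

class FakeSession:
    def __init__(self):
        self.added = []
        self.flushed = False
    def add(self, obj):
        self.added.append(obj)
    def flush(self):
        self.flushed = True

class FakeContext:
    def __init__(self):
        self.current = FakeSession()

class FakeToolDependencyModel:
    class installation_status:
        UNINSTALLED = 'Uninstalled'

class FakeModel:
    def __init__(self):
        self.context = FakeContext()
        self.ToolDependency = FakeToolDependencyModel

class FakeApp:
    def __init__(self):
        self.model = FakeModel()

class FakeDependency:
    status = 'Installed'
    error_message = 'previous failure'

def remove_tool_dependency(app, tool_dependency):
    # Directory removal elided; assume it succeeded for this sketch.
    sa_session = app.model.context.current
    tool_dependency.status = app.model.ToolDependency.installation_status.UNINSTALLED
    tool_dependency.error_message = None
    sa_session.add(tool_dependency)
    sa_session.flush()
    return True, ''

app = FakeApp()
dep = FakeDependency()
removed, error_message = remove_tool_dependency(app, dep)
```

Taking `app` rather than `trans` lets the function be called from contexts with no web request in flight (for example, background installation tasks).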
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: Dave Bouvier: Display the clone URL on an empty repository's view and manage pages for any user with write access.
by commits-noreply@bitbucket.org 05 Sep '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/53f51406718d/
Changeset: 53f51406718d
User: Dave Bouvier
Date: 2013-09-05 23:04:41
Summary: Display the clone URL on an empty repository's view and manage pages for any user with write access.
Affected #: 2 files
diff -r ea75f5629c38a529cf70c6578a74dac7a8073880 -r 53f51406718dbe21dc1deec10749e9dadd0ee77c templates/webapps/tool_shed/repository/manage_repository.mako
--- a/templates/webapps/tool_shed/repository/manage_repository.mako
+++ b/templates/webapps/tool_shed/repository/manage_repository.mako
@@ -133,7 +133,7 @@
<label>${sharable_link_label}</label>
${render_sharable_str( repository, changeset_revision=sharable_link_changeset_revision )}
</div>
- %if can_download:
+ %if can_download or can_push:
<div class="form-row"><label>Clone this repository:</label>
${render_clone_str( repository )}
diff -r ea75f5629c38a529cf70c6578a74dac7a8073880 -r 53f51406718dbe21dc1deec10749e9dadd0ee77c templates/webapps/tool_shed/repository/view_repository.mako
--- a/templates/webapps/tool_shed/repository/view_repository.mako
+++ b/templates/webapps/tool_shed/repository/view_repository.mako
@@ -86,7 +86,7 @@
<label>${sharable_link_label}</label>
${render_sharable_str( repository, changeset_revision=sharable_link_changeset_revision )}
</div>
- %if can_download:
+ %if can_download or can_push:
<div class="form-row"><label>Clone this repository:</label>
${render_clone_str( repository )}
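The template change in both hunks widens the condition for rendering the clone URL: previously only downloadable (non-empty) repositories showed it, so an empty repository hid its clone URL even from users who could push to it. The predicate, expressed as a plain Python sketch:

```python
# Sketch of the widened template condition: show the clone URL when the
# user can download OR has write (push) access, so an empty repository
# still exposes its clone URL to contributors.
def show_clone_url(can_download, can_push):
    return can_download or can_push
```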
Repository URL: https://bitbucket.org/galaxy/galaxy-central/