galaxy-commits
July 2014
- 1 participant
- 146 discussions
commit/galaxy-central: greg: Eliminate the use of trans in the tool shed's tool_dependency_util.py module.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/7ded35475477/
Changeset: 7ded35475477
User: greg
Date: 2014-07-17 21:49:48
Summary: Eliminate the use of trans in the tool shed's tool_dependency_util.py module.
Affected #: 2 files
diff -r 027d160ca90a264d1208184488479ec68e999b5d -r 7ded3547547731daa7d09b47bf601ce2f2209513 lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -169,7 +169,7 @@
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
tool_dependency_ids = tool_dependency_util.get_tool_dependency_ids( as_string=False, **kwd )
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_ids[ 0 ] )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_ids[ 0 ] )
if tool_dependency.in_error_state:
message = "This tool dependency is not installed correctly (see the <b>Tool dependency installation error</b> below). "
message += "Choose <b>Uninstall this tool dependency</b> from the <b>Repository Actions</b> menu, correct problems "
@@ -806,7 +806,7 @@
tool_dependency_ids = tool_dependency_util.get_tool_dependency_ids( as_string=False, **kwd )
if tool_dependency_ids:
# We need a tool_shed_repository, so get it from one of the tool_dependencies.
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_ids[ 0 ] )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_ids[ 0 ] )
tool_shed_repository = tool_dependency.tool_shed_repository
else:
# The user must be on the manage_repository_tool_dependencies page and clicked the button to either install or uninstall a
@@ -830,7 +830,7 @@
elif operation == 'uninstall':
tool_dependencies_for_uninstallation = []
for tool_dependency_id in tool_dependency_ids:
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_id )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_id )
if tool_dependency.status in [ trans.install_model.ToolDependency.installation_status.INSTALLED,
trans.install_model.ToolDependency.installation_status.ERROR ]:
tool_dependencies_for_uninstallation.append( tool_dependency )
@@ -845,7 +845,7 @@
if trans.app.config.tool_dependency_dir:
tool_dependencies_for_installation = []
for tool_dependency_id in tool_dependency_ids:
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_id )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_id )
if tool_dependency.status in [ trans.install_model.ToolDependency.installation_status.NEVER_INSTALLED,
trans.install_model.ToolDependency.installation_status.UNINSTALLED ]:
tool_dependencies_for_installation.append( tool_dependency )
@@ -889,7 +889,7 @@
repository_id = kwd.get( 'repository_id', None )
if tool_dependency_ids:
# We need a tool_shed_repository, so get it from one of the tool_dependencies.
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_ids[ 0 ] )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_ids[ 0 ] )
tool_shed_repository = tool_dependency.tool_shed_repository
else:
# The user must be on the manage_repository_tool_dependencies page and clicked the button to either install or uninstall a
@@ -914,7 +914,7 @@
if trans.app.config.tool_dependency_dir:
tool_dependencies_for_installation = []
for tool_dependency_id in tool_dependency_ids:
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_id )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_id )
if tool_dependency.status in [ trans.install_model.ToolDependency.installation_status.NEVER_INSTALLED,
trans.install_model.ToolDependency.installation_status.UNINSTALLED ]:
tool_dependencies_for_installation.append( tool_dependency )
@@ -1849,7 +1849,7 @@
tool_dependency_ids = util.listify( kwd.get( 'id', None ) )
tool_dependencies = []
for tool_dependency_id in tool_dependency_ids:
- tool_dependency = tool_dependency_util.get_tool_dependency( trans, tool_dependency_id )
+ tool_dependency = tool_dependency_util.get_tool_dependency( trans.app, tool_dependency_id )
tool_dependencies.append( tool_dependency )
tool_shed_repository = tool_dependencies[ 0 ].tool_shed_repository
if kwd.get( 'uninstall_tool_dependencies_button', False ):
diff -r 027d160ca90a264d1208184488479ec68e999b5d -r 7ded3547547731daa7d09b47bf601ce2f2209513 lib/tool_shed/util/tool_dependency_util.py
--- a/lib/tool_shed/util/tool_dependency_util.py
+++ b/lib/tool_shed/util/tool_dependency_util.py
@@ -128,9 +128,9 @@
platform_dict[ 'architecture' ] = machine.lower()
return platform_dict
-def get_tool_dependency( trans, id ):
+def get_tool_dependency( app, id ):
"""Get a tool_dependency from the database via id"""
- return trans.install_model.context.query( trans.install_model.ToolDependency ).get( trans.security.decode_id( id ) )
+ return app.install_model.context.query( app.install_model.ToolDependency ).get( app.security.decode_id( id ) )
def get_tool_dependency_by_name_type_repository( app, repository, name, type ):
context = app.install_model.context
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
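The change above follows one pattern throughout: utility functions that only need the application object now take `app` instead of the request-scoped `trans`, so they can be called outside a web request. A minimal sketch of that pattern (the `App`, `Trans`, and `Security` classes here are hypothetical stand-ins, not Galaxy's real classes, and the real `get_tool_dependency` returns a database object rather than a bare id):

```python
class Security:
    """Stand-in for Galaxy's id-encoding helper (hypothetical)."""
    def decode_id(self, encoded):
        return int(encoded, 16)

class App:
    """Application object: long-lived, not tied to any web request."""
    def __init__(self):
        self.security = Security()

class Trans:
    """Request-scoped transaction object; wraps the app."""
    def __init__(self, app):
        self.app = app
        self.security = app.security

# Before: the utility required the whole request-scoped trans.
def get_tool_dependency_old(trans, id):
    return trans.security.decode_id(id)

# After: the utility depends only on app, so callers at the web layer
# simply pass trans.app, and non-web callers can pass app directly.
def get_tool_dependency_new(app, id):
    return app.security.decode_id(id)

app = App()
trans = Trans(app)
assert get_tool_dependency_old(trans, "ff") == get_tool_dependency_new(app, "ff") == 255
```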
commit/galaxy-central: greg: Eliminate the use of trans in the tool shed's search_util.py module.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/027d160ca90a/
Changeset: 027d160ca90a
User: greg
Date: 2014-07-17 21:46:42
Summary: Eliminate the use of trans in the tool shed's search_util.py module.
Affected #: 2 files
diff -r f65eb16fdf1e0b7a6f3483373c32765107c3b2b6 -r 027d160ca90a264d1208184488479ec68e999b5d lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -1347,7 +1347,7 @@
match_tuples = []
ok = True
if tool_ids or tool_names or tool_versions:
- ok, match_tuples = search_util.search_repository_metadata( trans,
+ ok, match_tuples = search_util.search_repository_metadata( trans.app,
exact_matches_checked,
tool_ids=tool_ids,
tool_names=tool_names,
@@ -1437,9 +1437,14 @@
match_tuples = []
ok = True
if workflow_names:
- ok, match_tuples = search_util.search_repository_metadata( trans, exact_matches_checked, workflow_names=workflow_names )
+ ok, match_tuples = search_util.search_repository_metadata( trans.app,
+ exact_matches_checked,
+ workflow_names=workflow_names )
else:
- ok, match_tuples = search_util.search_repository_metadata( trans, exact_matches_checked, workflow_names=[], all_workflows=True )
+ ok, match_tuples = search_util.search_repository_metadata( trans.app,
+ exact_matches_checked,
+ workflow_names=[],
+ all_workflows=True )
if ok:
kwd[ 'match_tuples' ] = match_tuples
if trans.webapp.name == 'galaxy':
diff -r f65eb16fdf1e0b7a6f3483373c32765107c3b2b6 -r 027d160ca90a264d1208184488479ec68e999b5d lib/tool_shed/util/search_util.py
--- a/lib/tool_shed/util/search_util.py
+++ b/lib/tool_shed/util/search_util.py
@@ -81,15 +81,17 @@
match_tuples.append( ( repository_metadata.repository_id, repository_metadata.changeset_revision ) )
return match_tuples
-def search_repository_metadata( trans, exact_matches_checked, tool_ids='', tool_names='', tool_versions='', workflow_names='', all_workflows=False ):
+def search_repository_metadata( app, exact_matches_checked, tool_ids='', tool_names='', tool_versions='',
+ workflow_names='', all_workflows=False ):
+ sa_session=app.model.context.current
match_tuples = []
ok = True
if tool_ids or tool_names or tool_versions:
- for repository_metadata in trans.sa_session.query( trans.model.RepositoryMetadata ) \
- .filter( trans.model.RepositoryMetadata.table.c.includes_tools == True ) \
- .join( trans.model.Repository ) \
- .filter( and_( trans.model.Repository.table.c.deleted == False,
- trans.model.Repository.table.c.deprecated == False ) ):
+ for repository_metadata in sa_session.query( app.model.RepositoryMetadata ) \
+ .filter( app.model.RepositoryMetadata.table.c.includes_tools == True ) \
+ .join( app.model.Repository ) \
+ .filter( and_( app.model.Repository.table.c.deleted == False,
+ app.model.Repository.table.c.deprecated == False ) ):
metadata = repository_metadata.metadata
if metadata:
tools = metadata.get( 'tools', [] )
@@ -140,11 +142,11 @@
else:
ok = False
elif workflow_names or all_workflows:
- for repository_metadata in trans.sa_session.query( trans.model.RepositoryMetadata ) \
- .filter( trans.model.RepositoryMetadata.table.c.includes_workflows == True ) \
- .join( trans.model.Repository ) \
- .filter( and_( trans.model.Repository.table.c.deleted == False,
- trans.model.Repository.table.c.deprecated == False ) ):
+ for repository_metadata in sa_session.query( app.model.RepositoryMetadata ) \
+ .filter( app.model.RepositoryMetadata.table.c.includes_workflows == True ) \
+ .join( app.model.Repository ) \
+ .filter( and_( app.model.Repository.table.c.deleted == False,
+ app.model.Repository.table.c.deprecated == False ) ):
metadata = repository_metadata.metadata
if metadata:
# metadata[ 'workflows' ] is a list of tuples where each contained tuple is
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
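In this changeset the session lookup changes as well: instead of reaching for `trans.sa_session`, the utility resolves the session itself via `app.model.context.current`. Both paths end at the same object, which is what makes the refactor safe. A sketch with hypothetical stand-in classes (the real `context.current` is a SQLAlchemy scoped session):

```python
class _Context:
    """Stand-in for the model mapping's session context (hypothetical)."""
    def __init__(self, session):
        self.current = session

class _Model:
    def __init__(self, session):
        self.context = _Context(session)

class App:
    def __init__(self, session):
        self.model = _Model(session)

class Trans:
    """trans.sa_session is just the app's current session."""
    def __init__(self, app):
        self.app = app
        self.sa_session = app.model.context.current

session = object()  # stands in for a SQLAlchemy scoped session
app = App(session)
trans = Trans(app)

# Before: trans.sa_session; after: app.model.context.current.
# Same session either way, so queries behave identically.
assert trans.sa_session is app.model.context.current
```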
commit/galaxy-central: greg: Eliminate the use of trans in the tool shed's review_util.py module.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f65eb16fdf1e/
Changeset: f65eb16fdf1e
User: greg
Date: 2014-07-17 21:41:56
Summary: Eliminate the use of trans in the tool shed's review_util.py module.
Affected #: 3 files
diff -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 -r f65eb16fdf1e0b7a6f3483373c32765107c3b2b6 lib/galaxy/webapps/tool_shed/controllers/repository_review.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository_review.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository_review.py
@@ -34,7 +34,7 @@
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
encoded_review_id = kwd[ 'id' ]
- review = review_util.get_review( trans, encoded_review_id )
+ review = review_util.get_review( trans.app, encoded_review_id )
if kwd.get( 'approve_repository_review_button', False ):
approved_select_field_name = '%s%sapproved' % ( encoded_review_id, STRSEP )
approved_select_field_value = str( kwd[ approved_select_field_name ] )
@@ -67,7 +67,7 @@
def browse_review( self, trans, **kwd ):
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
- review = review_util.get_review( trans, kwd[ 'id' ] )
+ review = review_util.get_review( trans.app, kwd[ 'id' ] )
repository = review.repository
repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
rev, changeset_revision_label = hg_util.get_rev_label_from_changeset_revision( repo, review.changeset_revision )
@@ -104,7 +104,7 @@
if not name or not description:
message = 'Enter a valid name and a description'
status = 'error'
- elif review_util.get_component_by_name( trans, name ):
+ elif review_util.get_component_by_name( trans.app, name ):
message = 'A component with that name already exists'
status = 'error'
else:
@@ -137,7 +137,7 @@
if changeset_revision:
# Make sure there is not already a review of the revision by the user.
repository = suc.get_repository_in_tool_shed( trans.app, repository_id )
- if review_util.get_review_by_repository_id_changeset_revision_user_id( trans=trans,
+ if review_util.get_review_by_repository_id_changeset_revision_user_id( app=trans.app,
repository_id=repository_id,
changeset_revision=changeset_revision,
user_id=trans.security.encode_id( trans.user.id ) ):
@@ -145,7 +145,9 @@
status = "error"
else:
# See if there are any reviews for previous changeset revisions that the user can copy.
- if not create_without_copying and not previous_review_id and review_util.has_previous_repository_reviews( trans, repository, changeset_revision ):
+ if not create_without_copying and \
+ not previous_review_id and \
+ review_util.has_previous_repository_reviews( trans.app, repository, changeset_revision ):
return trans.response.send_redirect( web.url_for( controller='repository_review',
action='select_previous_review',
**kwd ) )
@@ -163,7 +165,7 @@
trans.sa_session.add( review )
trans.sa_session.flush()
if previous_review_id:
- review_to_copy = review_util.get_review( trans, previous_review_id )
+ review_to_copy = review_util.get_review( trans.app, previous_review_id )
self.copy_review( trans, review_to_copy, review )
review_id = trans.security.encode_id( review.id )
message = "Begin your review of revision <b>%s</b> of repository <b>%s</b>." \
@@ -199,7 +201,7 @@
action='manage_categories',
message=message,
status='error' ) )
- component = review_util.get_component( trans, id )
+ component = review_util.get_component( trans.app, id )
if kwd.get( 'edit_component_button', False ):
new_description = kwd.get( 'description', '' ).strip()
if component.description != new_description:
@@ -224,9 +226,9 @@
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
review_id = kwd.get( 'id', None )
- review = review_util.get_review( trans, review_id )
+ review = review_util.get_review( trans.app, review_id )
components_dict = odict()
- for component in review_util.get_components( trans ):
+ for component in review_util.get_components( trans.app ):
components_dict[ component.name ] = dict( component=component, component_review=None )
repository = review.repository
repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
@@ -276,8 +278,11 @@
approved = str( v )
elif component_review_attr == 'rating':
rating = int( str( v ) )
- component = review_util.get_component( trans, component_id )
- component_review = review_util.get_component_review_by_repository_review_id_component_id( trans, review_id, component_id )
+ component = review_util.get_component( trans.app, component_id )
+ component_review = \
+ review_util.get_component_review_by_repository_review_id_component_id( trans.app,
+ review_id,
+ component_id )
if component_review:
# See if the existing component review should be updated.
if component_review.comment != comment or \
@@ -477,7 +482,10 @@
rev, changeset_revision_label = hg_util.get_rev_label_from_changeset_revision( repo, changeset_revision )
if changeset_revision in reviewed_revision_hashes:
# Find the review for this changeset_revision
- repository_reviews = review_util.get_reviews_by_repository_id_changeset_revision( trans, repository_id, changeset_revision )
+ repository_reviews = \
+ review_util.get_reviews_by_repository_id_changeset_revision( trans.app,
+ repository_id,
+ changeset_revision )
# Determine if the current user can add a review to this revision.
can_add_review = trans.user not in [ repository_review.user for repository_review in repository_reviews ]
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans.app, repository_id, changeset_revision )
@@ -515,7 +523,9 @@
repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
installable = changeset_revision in [ metadata_revision.changeset_revision for metadata_revision in repository.metadata_revisions ]
rev, changeset_revision_label = hg_util.get_rev_label_from_changeset_revision( repo, changeset_revision )
- reviews = review_util.get_reviews_by_repository_id_changeset_revision( trans, repository_id, changeset_revision )
+ reviews = review_util.get_reviews_by_repository_id_changeset_revision( trans.app,
+ repository_id,
+ changeset_revision )
return trans.fill_template( '/webapps/tool_shed/repository_review/reviews_of_changeset_revision.mako',
repository=repository,
changeset_revision=changeset_revision,
@@ -534,7 +544,7 @@
if 'operation' in kwd:
operation = kwd['operation'].lower()
# The value of the received id is the encoded review id.
- review = review_util.get_review( trans, kwd[ 'id' ] )
+ review = review_util.get_review( trans.app, kwd[ 'id' ] )
repository = review.repository
kwd[ 'id' ] = trans.security.encode_id( repository.id )
if operation == "inspect repository revisions":
@@ -578,7 +588,9 @@
repository = suc.get_repository_in_tool_shed( trans.app, kwd[ 'id' ] )
changeset_revision = kwd.get( 'changeset_revision', None )
repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
- previous_reviews_dict = review_util.get_previous_repository_reviews( trans, repository, changeset_revision )
+ previous_reviews_dict = review_util.get_previous_repository_reviews( trans.app,
+ repository,
+ changeset_revision )
rev, changeset_revision_label = hg_util.get_rev_label_from_changeset_revision( repo, changeset_revision )
return trans.fill_template( '/webapps/tool_shed/repository_review/select_previous_review.mako',
repository=repository,
diff -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 -r f65eb16fdf1e0b7a6f3483373c32765107c3b2b6 lib/tool_shed/util/review_util.py
--- a/lib/tool_shed/util/review_util.py
+++ b/lib/tool_shed/util/review_util.py
@@ -7,87 +7,110 @@
log = logging.getLogger( __name__ )
-def can_browse_repository_reviews( trans, repository ):
- """Determine if there are any reviews of the received repository for which the current user has permission to browse any component reviews."""
- user = trans.user
+def can_browse_repository_reviews( app, user, repository ):
+ """
+ Determine if there are any reviews of the received repository for which the
+ current user has permission to browse any component reviews.
+ """
if user:
for review in repository.reviews:
for component_review in review.component_reviews:
- if trans.app.security_agent.user_can_browse_component_review( trans.app, repository, component_review, user ):
+ if app.security_agent.user_can_browse_component_review( app,
+ repository,
+ component_review, user ):
return True
return False
-def changeset_revision_reviewed_by_user( trans, user, repository, changeset_revision ):
+def changeset_revision_reviewed_by_user( user, repository, changeset_revision ):
"""Determine if the current changeset revision has been reviewed by the current user."""
for review in repository.reviews:
if review.changeset_revision == changeset_revision and review.user == user:
return True
return False
-def get_component( trans, id ):
+def get_component( app, id ):
"""Get a component from the database."""
- return trans.sa_session.query( trans.model.Component ).get( trans.security.decode_id( id ) )
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.Component ).get( app.security.decode_id( id ) )
-def get_component_review( trans, id ):
+def get_component_review( app, id ):
"""Get a component_review from the database"""
- return trans.sa_session.query( trans.model.ComponentReview ).get( trans.security.decode_id( id ) )
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.ComponentReview ).get( app.security.decode_id( id ) )
-def get_component_by_name( trans, name ):
+def get_component_by_name( app, name ):
"""Get a component from the database via a name."""
- return trans.sa_session.query( trans.app.model.Component ) \
- .filter( trans.app.model.Component.table.c.name==name ) \
- .first()
+ sa_session = app.model.context.current
+ return sa_session.query( app.app.model.Component ) \
+ .filter( app.app.model.Component.table.c.name==name ) \
+ .first()
-def get_component_review_by_repository_review_id_component_id( trans, repository_review_id, component_id ):
+def get_component_review_by_repository_review_id_component_id( app, repository_review_id, component_id ):
"""Get a component_review from the database via repository_review_id and component_id."""
- return trans.sa_session.query( trans.model.ComponentReview ) \
- .filter( and_( trans.model.ComponentReview.table.c.repository_review_id == trans.security.decode_id( repository_review_id ),
- trans.model.ComponentReview.table.c.component_id == trans.security.decode_id( component_id ) ) ) \
- .first()
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.ComponentReview ) \
+ .filter( and_( app.model.ComponentReview.table.c.repository_review_id == app.security.decode_id( repository_review_id ),
+ app.model.ComponentReview.table.c.component_id == app.security.decode_id( component_id ) ) ) \
+ .first()
-def get_components( trans ):
- return trans.sa_session.query( trans.app.model.Component ) \
- .order_by( trans.app.model.Component.name ) \
- .all()
+def get_components( app ):
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.Component ) \
+ .order_by( app.model.Component.name ) \
+ .all()
-def get_previous_repository_reviews( trans, repository, changeset_revision ):
- """Return an ordered dictionary of repository reviews up to and including the received changeset revision."""
- repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
+def get_previous_repository_reviews( app, repository, changeset_revision ):
+ """
+ Return an ordered dictionary of repository reviews up to and including the
+ received changeset revision.
+ """
+ repo = hg_util.get_repo_for_repository( app, repository=repository, repo_path=None, create=False )
reviewed_revision_hashes = [ review.changeset_revision for review in repository.reviews ]
previous_reviews_dict = odict()
for changeset in hg_util.reversed_upper_bounded_changelog( repo, changeset_revision ):
previous_changeset_revision = str( repo.changectx( changeset ) )
if previous_changeset_revision in reviewed_revision_hashes:
- previous_rev, previous_changeset_revision_label = hg_util.get_rev_label_from_changeset_revision( repo, previous_changeset_revision )
- revision_reviews = get_reviews_by_repository_id_changeset_revision( trans,
- trans.security.encode_id( repository.id ),
+ previous_rev, previous_changeset_revision_label = \
+ hg_util.get_rev_label_from_changeset_revision( repo, previous_changeset_revision )
+ revision_reviews = get_reviews_by_repository_id_changeset_revision( app,
+ app.security.encode_id( repository.id ),
previous_changeset_revision )
- previous_reviews_dict[ previous_changeset_revision ] = dict( changeset_revision_label=previous_changeset_revision_label,
- reviews=revision_reviews )
+ previous_reviews_dict[ previous_changeset_revision ] = \
+ dict( changeset_revision_label=previous_changeset_revision_label,
+ reviews=revision_reviews )
return previous_reviews_dict
-def get_review( trans, id ):
+def get_review( app, id ):
"""Get a repository_review from the database via id."""
- return trans.sa_session.query( trans.model.RepositoryReview ).get( trans.security.decode_id( id ) )
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.RepositoryReview ).get( app.security.decode_id( id ) )
-def get_review_by_repository_id_changeset_revision_user_id( trans, repository_id, changeset_revision, user_id ):
- """Get a repository_review from the database via repository id, changeset_revision and user_id."""
- return trans.sa_session.query( trans.model.RepositoryReview ) \
- .filter( and_( trans.model.RepositoryReview.repository_id == trans.security.decode_id( repository_id ),
- trans.model.RepositoryReview.changeset_revision == changeset_revision,
- trans.model.RepositoryReview.user_id == trans.security.decode_id( user_id ) ) ) \
- .first()
+def get_review_by_repository_id_changeset_revision_user_id( app, repository_id, changeset_revision, user_id ):
+ """
+ Get a repository_review from the database via repository id, changeset_revision
+ and user_id.
+ """
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.RepositoryReview ) \
+ .filter( and_( app.model.RepositoryReview.repository_id == app.security.decode_id( repository_id ),
+ app.model.RepositoryReview.changeset_revision == changeset_revision,
+ app.model.RepositoryReview.user_id == app.security.decode_id( user_id ) ) ) \
+ .first()
-def get_reviews_by_repository_id_changeset_revision( trans, repository_id, changeset_revision ):
+def get_reviews_by_repository_id_changeset_revision( app, repository_id, changeset_revision ):
"""Get all repository_reviews from the database via repository id and changeset_revision."""
- return trans.sa_session.query( trans.model.RepositoryReview ) \
- .filter( and_( trans.model.RepositoryReview.repository_id == trans.security.decode_id( repository_id ),
- trans.model.RepositoryReview.changeset_revision == changeset_revision ) ) \
- .all()
+ sa_session = app.model.context.current
+ return sa_session.query( app.model.RepositoryReview ) \
+ .filter( and_( app.model.RepositoryReview.repository_id == app.security.decode_id( repository_id ),
+ app.model.RepositoryReview.changeset_revision == changeset_revision ) ) \
+ .all()
-def has_previous_repository_reviews( trans, repository, changeset_revision ):
- """Determine if a repository has a changeset revision review prior to the received changeset revision."""
- repo = hg_util.get_repo_for_repository( trans.app, repository=repository, repo_path=None, create=False )
+def has_previous_repository_reviews( app, repository, changeset_revision ):
+ """
+ Determine if a repository has a changeset revision review prior to the
+ received changeset revision.
+ """
+ repo = hg_util.get_repo_for_repository( app, repository=repository, repo_path=None, create=False )
reviewed_revision_hashes = [ review.changeset_revision for review in repository.reviews ]
for changeset in hg_util.reversed_upper_bounded_changelog( repo, changeset_revision ):
previous_changeset_revision = str( repo.changectx( changeset ) )
diff -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 -r f65eb16fdf1e0b7a6f3483373c32765107c3b2b6 templates/webapps/tool_shed/common/repository_actions_menu.mako
--- a/templates/webapps/tool_shed/common/repository_actions_menu.mako
+++ b/templates/webapps/tool_shed/common/repository_actions_menu.mako
@@ -34,7 +34,7 @@
can_browse_contents = not is_new
- if can_browse_repository_reviews( trans, repository ):
+ if can_browse_repository_reviews( trans.app, trans.user, repository ):
can_browse_reviews = True
else:
can_browse_reviews = False
@@ -79,7 +79,7 @@
can_review_repository = True
else:
can_review_repository = False
- if changeset_revision_reviewed_by_user( trans, trans.user, repository, changeset_revision ):
+ if changeset_revision_reviewed_by_user( trans.user, repository, changeset_revision ):
reviewed_by_user = True
else:
reviewed_by_user = False
@@ -88,7 +88,7 @@
reviewed_by_user = False
if reviewed_by_user:
- review = get_review_by_repository_id_changeset_revision_user_id( trans=trans,
+ review = get_review_by_repository_id_changeset_revision_user_id( app=trans.app,
repository_id=trans.security.encode_id( repository.id ),
changeset_revision=changeset_revision,
user_id=trans.security.encode_id( trans.user.id ) )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
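One function in this changeset, `changeset_revision_reviewed_by_user`, never actually used `trans`, so the refactor simply drops it from the signature. The function body below is taken from the diff; the `Review` and `Repository` classes are hypothetical stand-ins for the mapped model objects:

```python
class Review:
    """Stand-in for a repository review row (hypothetical)."""
    def __init__(self, changeset_revision, user):
        self.changeset_revision = changeset_revision
        self.user = user

class Repository:
    """Stand-in for a repository with a reviews relation (hypothetical)."""
    def __init__(self, reviews):
        self.reviews = reviews

# After the changeset: no trans parameter, pure function of its arguments.
def changeset_revision_reviewed_by_user(user, repository, changeset_revision):
    """Determine if the changeset revision has been reviewed by the user."""
    for review in repository.reviews:
        if review.changeset_revision == changeset_revision and review.user == user:
            return True
    return False

alice = object()
repo = Repository([Review("abc123", alice)])
assert changeset_revision_reviewed_by_user(alice, repo, "abc123")
assert not changeset_revision_reviewed_by_user(alice, repo, "def456")
```

Callers such as the `repository_actions_menu.mako` template now pass `trans.user` explicitly rather than handing over the whole `trans`.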
commit/galaxy-central: greg: Eliminate the use of trans in functions within the hg_util.py module.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/49e900b2a871/
Changeset: 49e900b2a871
User: greg
Date: 2014-07-17 21:22:51
Summary: Eliminate the use of trans in functions within the hg_util.py module.
Affected #: 6 files
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -1273,7 +1273,7 @@
else:
containers_dict = None
export_repository_dependencies_check_box = None
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=True )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=True )
return trans.fill_template( "/webapps/tool_shed/repository/export_repository.mako",
changeset_revision=changeset_revision,
containers_dict=containers_dict,
@@ -2374,7 +2374,7 @@
selected_value=changeset_revision,
add_id_to_name=False,
downloadable=False )
- revision_label = hg_util.get_revision_label( trans, repository, repository.tip( trans.app ), include_date=False )
+ revision_label = hg_util.get_revision_label( trans.app, repository, repository.tip( trans.app ), include_date=False )
repository_metadata = None
metadata = None
is_malicious = False
@@ -2383,7 +2383,7 @@
if changeset_revision != hg_util.INITIAL_CHANGELOG_HASH:
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans.app, id, changeset_revision )
if repository_metadata:
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=False )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=False )
metadata = repository_metadata.metadata
is_malicious = repository_metadata.malicious
else:
@@ -2394,7 +2394,7 @@
if previous_changeset_revision != hg_util.INITIAL_CHANGELOG_HASH:
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans.app, id, previous_changeset_revision )
if repository_metadata:
- revision_label = hg_util.get_revision_label( trans, repository, previous_changeset_revision, include_date=False )
+ revision_label = hg_util.get_revision_label( trans.app, repository, previous_changeset_revision, include_date=False )
metadata = repository_metadata.metadata
is_malicious = repository_metadata.malicious
changeset_revision = previous_changeset_revision
@@ -2630,7 +2630,7 @@
repository_metadata_id = None
metadata = None
repository_dependencies = None
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=True )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=True )
changeset_revision_select_field = grids_util.build_changeset_revision_select_field( trans,
repository,
selected_value=changeset_revision,
@@ -2725,7 +2725,7 @@
changeset_revision,
metadata_only=True )
repository_type_select_field = rt_util.build_repository_type_select_field( trans, repository=repository )
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=True )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=True )
return trans.fill_template( '/webapps/tool_shed/repository/rate_repository.mako',
repository=repository,
metadata=metadata,
@@ -3317,7 +3317,7 @@
selected_value=changeset_revision,
add_id_to_name=False,
downloadable=False )
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=False )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=False )
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans.app, id, changeset_revision )
if repository_metadata:
metadata = repository_metadata.metadata
@@ -3379,7 +3379,7 @@
tool = None
guid = None
original_tool_data_path = trans.app.config.tool_data_path
- revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=False )
+ revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=False )
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans.app, repository_id, changeset_revision )
if repository_metadata:
repository_metadata_id = trans.security.encode_id( repository_metadata.id )
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/tool_shed/grids/admin_grids.py
--- a/lib/tool_shed/grids/admin_grids.py
+++ b/lib/tool_shed/grids/admin_grids.py
@@ -426,7 +426,7 @@
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
- return hg_util.get_revision_label( trans,
+ return hg_util.get_revision_label( trans.app,
repository,
repository_metadata.changeset_revision,
include_date=True,
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/tool_shed/grids/repository_grids.py
--- a/lib/tool_shed/grids/repository_grids.py
+++ b/lib/tool_shed/grids/repository_grids.py
@@ -1346,7 +1346,7 @@
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
changeset_revision = repository_metadata.changeset_revision
- changeset_revision_label = hg_util.get_revision_label( trans, repository, changeset_revision, include_date=True )
+ changeset_revision_label = hg_util.get_revision_label( trans.app, repository, changeset_revision, include_date=True )
return changeset_revision_label
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/tool_shed/grids/repository_review_grids.py
--- a/lib/tool_shed/grids/repository_review_grids.py
+++ b/lib/tool_shed/grids/repository_review_grids.py
@@ -79,7 +79,7 @@
rval = ''
for repository_metadata in repository_metadata_revisions:
rev, label, changeset_revision = \
- hg_util.get_rev_label_changeset_revision_from_repository_metadata( trans,
+ hg_util.get_rev_label_changeset_revision_from_repository_metadata( trans.app,
repository_metadata,
repository=repository,
include_date=True,
@@ -304,7 +304,7 @@
rval += 'edit_review'
else:
rval +='browse_review'
- revision_label = hg_util.get_revision_label( trans,
+ revision_label = hg_util.get_revision_label( trans.app,
review.repository,
review.changeset_revision,
include_date=True,
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/tool_shed/grids/util.py
--- a/lib/tool_shed/grids/util.py
+++ b/lib/tool_shed/grids/util.py
@@ -58,7 +58,7 @@
repository_metadata_revisions = repository.metadata_revisions
for repository_metadata in repository_metadata_revisions:
rev, label, changeset_revision = \
- hg_util.get_rev_label_changeset_revision_from_repository_metadata( trans,
+ hg_util.get_rev_label_changeset_revision_from_repository_metadata( trans.app,
repository_metadata,
repository=repository,
include_date=True,
diff -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc -r 49e900b2a87108eb1e8b2bbea8a1bf6daedd6328 lib/tool_shed/util/hg_util.py
--- a/lib/tool_shed/util/hg_util.py
+++ b/lib/tool_shed/util/hg_util.py
@@ -249,12 +249,12 @@
reversed_changelog.insert( 0, changeset )
return reversed_changelog
-def get_revision_label( trans, repository, changeset_revision, include_date=True, include_hash=True ):
+def get_revision_label( app, repository, changeset_revision, include_date=True, include_hash=True ):
"""
Return a string consisting of the human read-able changeset rev and the changeset revision string
which includes the revision date if the receive include_date is True.
"""
- repo = get_repo_for_repository( trans.app, repository=repository, repo_path=None )
+ repo = get_repo_for_repository( app, repository=repository, repo_path=None )
ctx = get_changectx_for_changeset( repo, changeset_revision )
if ctx:
return get_revision_label_from_ctx( ctx, include_date=include_date, include_hash=include_hash )
@@ -264,11 +264,11 @@
else:
return "-1"
-def get_rev_label_changeset_revision_from_repository_metadata( trans, repository_metadata, repository=None,
+def get_rev_label_changeset_revision_from_repository_metadata( app, repository_metadata, repository=None,
include_date=True, include_hash=True ):
if repository is None:
repository = repository_metadata.repository
- repo = hg.repository( get_configured_ui(), repository.repo_path( trans.app ) )
+ repo = hg.repository( get_configured_ui(), repository.repo_path( app ) )
changeset_revision = repository_metadata.changeset_revision
ctx = get_changectx_for_changeset( repo, changeset_revision )
if ctx:
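The mechanical refactor in this changeset can be sketched as follows. The names mirror the hg_util helpers touched by the diff, but the bodies are hypothetical stand-ins: utility functions that only need the application object now take `app` directly instead of the request-scoped `trans`, so they can be called outside of a web request.

```python
class Repository(object):
    """Minimal stand-in for a tool shed repository record."""
    def __init__(self, name):
        self.name = name

# Before (web-coupled):
#   def get_revision_label(trans, repository, rev, ...):
#       repo = get_repo_for_repository(trans.app, repository, ...)
# After (app-scoped) -- the caller passes trans.app explicitly:
def get_revision_label(app, repository, rev, include_hash=True):
    # Label construction here is illustrative, not the real hg lookup.
    label = "r%s" % rev
    if include_hash:
        label += " (%s)" % repository.name
    return label

# Inside a controller, call sites change from passing trans to trans.app:
# revision_label = get_revision_label(trans.app, repository, rev)
```

The function body no longer reaches through `trans`, which is the whole point of the changeset: the web layer adapts at the call site, and the utility stays web-agnostic.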
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Make sure dir is not None when used to create a path - thanks to Nicolas Lapalu.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/23f0c41b1cd7/
Changeset: 23f0c41b1cd7
User: greg
Date: 2014-07-17 19:38:53
Summary: Make sure dir is not None when used to create a path - thanks to Nicolas Lapalu.
Affected #: 1 file
diff -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a -r 23f0c41b1cd7d7ef2087e1eec387cd3b116923fc lib/tool_shed/galaxy_install/install_manager.py
--- a/lib/tool_shed/galaxy_install/install_manager.py
+++ b/lib/tool_shed/galaxy_install/install_manager.py
@@ -100,6 +100,8 @@
# We're in stage 2 of the installation process. The package has been down-loaded, so we can
# now perform all of the actions defined for building it.
for action_tup in filtered_actions:
+ if dir is None:
+ dir = ''
current_dir = os.path.abspath( os.path.join( work_dir, dir ) )
with lcd( current_dir ):
action_type, action_dict = action_tup
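The fix above guards against `os.path.join` receiving `None`, which raises a `TypeError`. A self-contained sketch of the pattern (the function name is hypothetical; the real code lives inline in `install_manager.py`):

```python
import os

def resolve_action_dir(work_dir, dir=None):
    # os.path.join(work_dir, None) raises TypeError, so substitute an
    # empty string, which makes the result the work_dir itself.
    if dir is None:
        dir = ''
    return os.path.abspath(os.path.join(work_dir, dir))
```

With `dir=None` the resolved path is simply the absolute form of `work_dir`, matching the behavior the patch introduces.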
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Add classes for managing tool shed repository metadata.
by commits-noreply@bitbucket.org 17 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/4d2e744f8efc/
Changeset: 4d2e744f8efc
User: greg
Date: 2014-07-17 19:20:49
Summary: Add classes for managing tool shed repository metadata.
Affected #: 20 files
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -11,6 +11,7 @@
from galaxy.web.base.controller import BaseAPIController
from tool_shed.galaxy_install.install_manager import InstallRepositoryManager
+from tool_shed.galaxy_install.metadata.installed_repository_metadata_manager import InstalledRepositoryMetadataManager
from tool_shed.galaxy_install.repair_repository_manager import RepairRepositoryManager
from tool_shed.util import common_util
from tool_shed.util import encoding_util
@@ -417,7 +418,8 @@
for repository in query:
repository_id = trans.security.encode_id( repository.id )
try:
- invalid_file_tups, metadata_dict = metadata_util.reset_all_metadata_on_installed_repository( trans.app, repository_id )
+ irmm = InstalledRepositoryMetadataManager( trans.app )
+ invalid_file_tups, metadata_dict = irmm.reset_all_metadata_on_installed_repository( repository_id )
if invalid_file_tups:
message = tool_util.generate_message_for_invalid_tools( trans.app,
invalid_file_tups,
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -16,7 +16,6 @@
from tool_shed.util import datatype_util
from tool_shed.util import encoding_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
from tool_shed.util import readme_util
from tool_shed.util import repository_maintenance_util
from tool_shed.util import shed_util_common as suc
@@ -27,8 +26,9 @@
from tool_shed.galaxy_install import dependency_display
from tool_shed.galaxy_install import install_manager
+from tool_shed.galaxy_install.grids import admin_toolshed_grids
+from tool_shed.galaxy_install.metadata.installed_repository_metadata_manager import InstalledRepositoryMetadataManager
from tool_shed.galaxy_install.repair_repository_manager import RepairRepositoryManager
-import tool_shed.galaxy_install.grids.admin_toolshed_grids as admin_toolshed_grids
from tool_shed.galaxy_install.repository_dependencies import repository_dependency_manager
log = logging.getLogger( __name__ )
@@ -1704,7 +1704,8 @@
@web.require_admin
def reset_metadata_on_selected_installed_repositories( self, trans, **kwd ):
if 'reset_metadata_on_selected_repositories_button' in kwd:
- message, status = metadata_util.reset_metadata_on_selected_repositories( trans.app, trans.user, **kwd )
+ irmm = InstalledRepositoryMetadataManager( trans.app )
+ message, status = irmm.reset_metadata_on_selected_repositories( trans.user, **kwd )
else:
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
@@ -1723,17 +1724,17 @@
tool_path, relative_install_dir = repository.get_tool_relative_path( trans.app )
if relative_install_dir:
original_metadata_dict = repository.metadata
+ irmm = InstalledRepositoryMetadataManager( trans.app )
metadata_dict, invalid_file_tups = \
- metadata_util.generate_metadata_for_changeset_revision( app=trans.app,
- repository=repository,
- changeset_revision=repository.changeset_revision,
- repository_clone_url=repository_clone_url,
- shed_config_dict = repository.get_shed_config_dict( trans.app ),
- relative_install_dir=relative_install_dir,
- repository_files_dir=None,
- resetting_all_metadata_on_repository=False,
- updating_installed_repository=False,
- persist=False )
+ irmm.generate_metadata_for_changeset_revision( repository=repository,
+ changeset_revision=repository.changeset_revision,
+ repository_clone_url=repository_clone_url,
+ shed_config_dict = repository.get_shed_config_dict( trans.app ),
+ relative_install_dir=relative_install_dir,
+ repository_files_dir=None,
+ resetting_all_metadata_on_repository=False,
+ updating_installed_repository=False,
+ persist=False )
repository.metadata = metadata_dict
if metadata_dict != original_metadata_dict:
suc.update_in_shed_tool_config( trans.app, repository )
@@ -1926,17 +1927,17 @@
if repository.includes_data_managers:
data_manager_util.remove_from_data_manager( trans.app, repository )
# Update the repository metadata.
+ irmm = InstalledRepositoryMetadataManager( trans.app )
metadata_dict, invalid_file_tups = \
- metadata_util.generate_metadata_for_changeset_revision( app=trans.app,
- repository=repository,
- changeset_revision=latest_changeset_revision,
- repository_clone_url=repository_clone_url,
- shed_config_dict=repository.get_shed_config_dict( trans.app ),
- relative_install_dir=relative_install_dir,
- repository_files_dir=None,
- resetting_all_metadata_on_repository=False,
- updating_installed_repository=True,
- persist=True )
+ irmm.generate_metadata_for_changeset_revision( repository=repository,
+ changeset_revision=latest_changeset_revision,
+ repository_clone_url=repository_clone_url,
+ shed_config_dict=repository.get_shed_config_dict( trans.app ),
+ relative_install_dir=relative_install_dir,
+ repository_files_dir=None,
+ resetting_all_metadata_on_repository=False,
+ updating_installed_repository=True,
+ persist=True )
if 'tools' in metadata_dict:
tool_panel_dict = metadata_dict.get( 'tool_panel_section', None )
if tool_panel_dict is None:
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -4,19 +4,21 @@
import tarfile
import tempfile
from time import strftime
-from galaxy import eggs
+
from galaxy import util
from galaxy import web
from galaxy.model.orm import and_
from galaxy.web.base.controller import BaseAPIController
from galaxy.web.base.controller import HTTPBadRequest
from galaxy.web.framework.helpers import time_ago
+
from tool_shed.capsule import capsule_manager
-import tool_shed.repository_types.util as rt_util
+from tool_shed.metadata import repository_metadata_manager
+from tool_shed.repository_types import util as rt_util
+
from tool_shed.util import basic_util
from tool_shed.util import encoding_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
from tool_shed.util import repository_maintenance_util
from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_util
@@ -428,8 +430,9 @@
log.debug( "Resetting metadata on repository %s" % str( repository.name ) )
repository_id = trans.security.encode_id( repository.id )
try:
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
invalid_file_tups, metadata_dict = \
- metadata_util.reset_all_metadata_on_repository_in_tool_shed( trans.app, trans.user, repository_id )
+ rmm.reset_all_metadata_on_repository_in_tool_shed( trans.user, repository_id )
if invalid_file_tups:
message = tool_util.generate_message_for_invalid_tools( trans.app,
invalid_file_tups,
@@ -508,10 +511,10 @@
results = dict( start_time=start_time,
repository_status=[] )
try:
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
invalid_file_tups, metadata_dict = \
- metadata_util.reset_all_metadata_on_repository_in_tool_shed( trans.app,
- trans.user,
- trans.security.encode_id( repository.id ) )
+ rmm.reset_all_metadata_on_repository_in_tool_shed( trans.user,
+ trans.security.encode_id( repository.id ) )
if invalid_file_tups:
message = tool_util.generate_message_for_invalid_tools( trans.app,
invalid_file_tups,
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/tool_shed/controllers/admin.py
--- a/lib/galaxy/webapps/tool_shed/controllers/admin.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/admin.py
@@ -1,13 +1,18 @@
import logging
+
+from galaxy import util
+from galaxy.util import inflector
+from galaxy import web
+
from galaxy.web.base.controller import BaseUIController
from galaxy.web.base.controllers.admin import Admin
-from galaxy import util
-from galaxy import web
-from galaxy.util import inflector
-import tool_shed.util.shed_util_common as suc
-import tool_shed.util.metadata_util as metadata_util
+
+import tool_shed.grids.admin_grids as admin_grids
+from tool_shed.metadata import repository_metadata_manager
+
+from tool_shed.util import metadata_util
from tool_shed.util import repository_maintenance_util
-import tool_shed.grids.admin_grids as admin_grids
+from tool_shed.util import shed_util_common as suc
log = logging.getLogger( __name__ )
@@ -344,7 +349,8 @@
@web.require_admin
def reset_metadata_on_selected_repositories_in_tool_shed( self, trans, **kwd ):
if 'reset_metadata_on_selected_repositories_button' in kwd:
- message, status = metadata_util.reset_metadata_on_selected_repositories( trans.app, trans.user, **kwd )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
+ message, status = rmm.reset_metadata_on_selected_repositories( trans.user, **kwd )
else:
message = util.restore_text( kwd.get( 'message', '' ) )
status = kwd.get( 'status', 'done' )
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/tool_shed/controllers/hg.py
--- a/lib/galaxy/webapps/tool_shed/controllers/hg.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/hg.py
@@ -3,7 +3,7 @@
from galaxy.web.base.controller import BaseUIController
from tool_shed.util.shed_util_common import get_repository_by_name_and_owner
from tool_shed.util.hg_util import update_repository
-from tool_shed.util.metadata_util import set_repository_metadata
+from tool_shed.metadata import repository_metadata_manager
from galaxy import eggs
eggs.require('mercurial')
@@ -28,9 +28,11 @@
return hgwebapp
wsgi_app = wsgiapplication( make_web_app )
if hg_version >= '2.2.3' and cmd == 'pushkey':
- # When doing an "hg push" from the command line, the following commands, in order, will be retrieved from environ, depending
- # upon the mercurial version being used. In mercurial version 2.2.3, section 15.2. Command changes includes a new feature:
- # pushkey: add hooks for pushkey/listkeys (see http://mercurial.selenic.com/wiki/WhatsNew#Mercurial_2.2.3_.282012-07-01.29)
+ # When doing an "hg push" from the command line, the following commands, in order, will be
+ # retrieved from environ, depending upon the mercurial version being used. In mercurial
+ # version 2.2.3, section 15.2. Command changes includes a new feature:
+ # pushkey: add hooks for pushkey/listkeys
+ # (see http://mercurial.selenic.com/wiki/WhatsNew#Mercurial_2.2.3_.282012-07-01.29)
# We require version 2.2.3 since the pushkey hook was added in that version.
# If mercurial version >= '2.2.3': capabilities -> batch -> branchmap -> unbundle -> listkeys -> pushkey
path_info = kwd.get( 'path_info', None )
@@ -39,13 +41,15 @@
repository = get_repository_by_name_and_owner( trans.app, name, owner )
if repository:
if hg_version >= '2.2.3':
- # Update the repository on disk to the tip revision, because the web upload form uses the on-disk working
- # directory. If the repository is not updated on disk, pushing from the command line and then uploading
- # via the web interface will result in a new head being created.
+ # Update the repository on disk to the tip revision, because the web upload
+ # form uses the on-disk working directory. If the repository is not updated
+ # on disk, pushing from the command line and then uploading via the web
+ # interface will result in a new head being created.
repo = hg.repository( ui.ui(), repository.repo_path( trans.app ) )
update_repository( repo, ctx_rev=None )
# Set metadata using the repository files on disk.
- error_message, status = set_repository_metadata( trans.app, trans.request.host, trans.user, repository )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
+ error_message, status = rmm.set_repository_metadata( trans.request.host, trans.user, repository )
if status == 'ok' and error_message:
log.debug( "Successfully reset metadata on repository %s owned by %s, but encountered problem: %s" % \
( str( repository.name ), str( repository.user.username ), error_message ) )
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -19,6 +19,7 @@
from tool_shed.dependencies.repository import relation_builder
from tool_shed.galaxy_install import dependency_display
+from tool_shed.metadata import repository_metadata_manager
from tool_shed.util import basic_util
from tool_shed.util import common_util
@@ -2610,14 +2611,20 @@
repository_dependencies_dict = metadata[ 'repository_dependencies' ]
rd_tups = repository_dependencies_dict.get( 'repository_dependencies', [] )
for rd_tup in rd_tups:
- rdtool_shed, rd_name, rd_owner, rd_changeset_revision, rd_prior_installation_required, rd_only_if_compiling_contained_td = \
+ rdtool_shed, \
+ rd_name, \
+ rd_owner, \
+ rd_changeset_revision, \
+ rd_prior_installation_required, \
+ rd_only_if_compiling_contained_td = \
common_util.parse_repository_dependency_tuple( rd_tup )
if not util.asbool( rd_only_if_compiling_contained_td ):
invalid = True
break
if invalid:
- message = metadata_util.generate_message_for_invalid_repository_dependencies( metadata,
- error_from_tuple=False )
+ dd = dependency_display.DependencyDisplayer( trans.app )
+ message = dd.generate_message_for_invalid_repository_dependencies( metadata,
+ error_from_tuple=False )
status = 'error'
else:
repository_metadata_id = None
@@ -2735,11 +2742,15 @@
def reset_all_metadata( self, trans, id, **kwd ):
"""Reset all metadata on the complete changelog for a single repository in the tool shed."""
# This method is called only from the ~/templates/webapps/tool_shed/repository/manage_repository.mako template.
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
invalid_file_tups, metadata_dict = \
- metadata_util.reset_all_metadata_on_repository_in_tool_shed( trans.app, trans.user, id, **kwd )
+ rmm.reset_all_metadata_on_repository_in_tool_shed( trans.user, id, **kwd )
if invalid_file_tups:
repository = suc.get_repository_in_tool_shed( trans.app, id )
- message = tool_util.generate_message_for_invalid_tools( trans.app, invalid_file_tups, repository, metadata_dict )
+ message = tool_util.generate_message_for_invalid_tools( trans.app,
+ invalid_file_tups,
+ repository,
+ metadata_dict )
status = 'error'
else:
message = "All repository metadata has been reset. "
@@ -2753,7 +2764,8 @@
@web.expose
def reset_metadata_on_my_writable_repositories_in_tool_shed( self, trans, **kwd ):
if 'reset_metadata_on_selected_repositories_button' in kwd:
- message, status = metadata_util.reset_metadata_on_selected_repositories( trans.app, trans.user, **kwd )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
+ message, status = rmm.reset_metadata_on_selected_repositories( trans.user, **kwd )
else:
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
@@ -2817,11 +2829,11 @@
if tip == repository.tip( trans.app ):
message += 'No changes to repository. '
else:
- status, error_message = metadata_util.set_repository_metadata_due_to_new_tip( trans.app,
- trans.request.host,
- trans.user,
- repository,
- **kwd )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
+ status, error_message = rmm.set_repository_metadata_due_to_new_tip( trans.request.host,
+ trans.user,
+ repository,
+ **kwd )
if error_message:
message = error_message
else:
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -12,12 +12,12 @@
from tool_shed.dependencies import attribute_handlers
from tool_shed.galaxy_install import dependency_display
+from tool_shed.metadata import repository_metadata_manager
import tool_shed.repository_types.util as rt_util
from tool_shed.util import basic_util
from tool_shed.util import commit_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_util
from tool_shed.util import xml_util
@@ -262,12 +262,13 @@
( len( files_to_remove ), upload_point )
else:
message += " %d files were removed from the repository root. " % len( files_to_remove )
- status, error_message = metadata_util.set_repository_metadata_due_to_new_tip( trans.app,
- trans.request.host,
- trans.user,
- repository,
- content_alert_str=content_alert_str,
- **kwd )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( trans.app )
+ status, error_message = \
+ rmm.set_repository_metadata_due_to_new_tip( trans.request.host,
+ trans.user,
+ repository,
+ content_alert_str=content_alert_str,
+ **kwd )
if error_message:
message = error_message
kwd[ 'message' ] = message
@@ -303,8 +304,8 @@
status = 'error'
# Handle messaging for invalid repository dependencies.
invalid_repository_dependencies_message = \
- metadata_util.generate_message_for_invalid_repository_dependencies( metadata_dict,
- error_from_tuple=True )
+ dd.generate_message_for_invalid_repository_dependencies( metadata_dict,
+ error_from_tuple=True )
if invalid_repository_dependencies_message:
message += invalid_repository_dependencies_message
status = 'error'
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/capsule/capsule_manager.py
--- a/lib/tool_shed/capsule/capsule_manager.py
+++ b/lib/tool_shed/capsule/capsule_manager.py
@@ -7,23 +7,26 @@
import urllib
from time import gmtime
from time import strftime
-import tool_shed.repository_types.util as rt_util
+
from galaxy import web
from galaxy.util import asbool
from galaxy.util import CHUNK_SIZE
from galaxy.util.odict import odict
+
from tool_shed.dependencies.repository.relation_builder import RelationBuilder
from tool_shed.dependencies import attribute_handlers
+from tool_shed.galaxy_install.repository_dependencies.repository_dependency_manager import RepositoryDependencyInstallManager
+from tool_shed.metadata import repository_metadata_manager
+import tool_shed.repository_types.util as rt_util
+
from tool_shed.util import basic_util
from tool_shed.util import commit_util
from tool_shed.util import common_util
from tool_shed.util import encoding_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
from tool_shed.util import repository_maintenance_util
from tool_shed.util import shed_util_common as suc
from tool_shed.util import xml_util
-from tool_shed.galaxy_install.repository_dependencies.repository_dependency_manager import RepositoryDependencyInstallManager
log = logging.getLogger( __name__ )
@@ -33,6 +36,7 @@
def __init__( self ):
self.exported_repository_elems = []
+
class ExportRepositoryManager( object ):
def __init__( self, app, user, tool_shed_url, repository, changeset_revision, export_repository_dependencies, using_api ):
@@ -723,11 +727,11 @@
results_dict[ 'ok' ] = False
results_dict[ 'error_message' ] += error_message
try:
- status, error_message = metadata_util.set_repository_metadata_due_to_new_tip( self.app,
- self.host,
- self.user,
- repository,
- content_alert_str=content_alert_str )
+ rmm = repository_metadata_manager.RepositoryMetadataManager( self.app )
+ status, error_message = rmm.set_repository_metadata_due_to_new_tip( self.host,
+ self.user,
+ repository,
+ content_alert_str=content_alert_str )
if error_message:
results_dict[ 'ok' ] = False
results_dict[ 'error_message' ] += error_message
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/galaxy_install/dependency_display.py
--- a/lib/tool_shed/galaxy_install/dependency_display.py
+++ b/lib/tool_shed/galaxy_install/dependency_display.py
@@ -48,6 +48,57 @@
tool_dependencies[ dependency_key ] = requirements_dict
return tool_dependencies
+ def generate_message_for_invalid_repository_dependencies( self, metadata_dict, error_from_tuple=False ):
+ """
+ Get or generate and return an error message associated with an invalid repository dependency.
+ """
+ message = ''
+ if metadata_dict:
+ if error_from_tuple:
+ # Return the error messages associated with a set of one or more invalid repository
+ # dependency tuples.
+ invalid_repository_dependencies_dict = metadata_dict.get( 'invalid_repository_dependencies', None )
+ if invalid_repository_dependencies_dict is not None:
+ invalid_repository_dependencies = \
+ invalid_repository_dependencies_dict.get( 'invalid_repository_dependencies', [] )
+ for repository_dependency_tup in invalid_repository_dependencies:
+ toolshed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td, error = \
+ common_util.parse_repository_dependency_tuple( repository_dependency_tup, contains_error=True )
+ if error:
+ message += '%s ' % str( error )
+ else:
+ # The complete dependency hierarchy could not be determined for a repository being installed into
+ # Galaxy. This is likely due to invalid repository dependency definitions, so we'll get them from
+ # the metadata and parse them for display in an error message. This will hopefully communicate the
+ # problem to the user in such a way that a resolution can be determined.
+ message += 'The complete dependency hierarchy could not be determined for this repository, so no required '
+ message += 'repositories will not be installed. This is likely due to invalid repository dependency definitions. '
+ repository_dependencies_dict = metadata_dict.get( 'repository_dependencies', None )
+ if repository_dependencies_dict is not None:
+ rd_tups = repository_dependencies_dict.get( 'repository_dependencies', None )
+ if rd_tups is not None:
+ message += 'Here are the attributes of the dependencies defined for this repository to help determine the '
+ message += 'cause of this problem.<br/>'
+ message += '<table cellpadding="2" cellspacing="2">'
+ message += '<tr><th>Tool shed</th><th>Repository name</th><th>Owner</th><th>Changeset revision</th>'
+ message += '<th>Prior install required</th></tr>'
+ for rd_tup in rd_tups:
+ tool_shed, name, owner, changeset_revision, pir, oicct = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
+ if util.asbool( pir ):
+ pir_str = 'True'
+ else:
+ pir_str = ''
+ message += '<tr><td>%s</td><td>%s</td><td>%s</td><td>%s</td><td>%s</td></tr>' % \
+ ( tool_shed, name, owner, changeset_revision, pir_str )
+ message += '</table>'
+ return message
+
def generate_message_for_invalid_tool_dependencies( self, metadata_dict ):
"""
Tool dependency definitions can only be invalid if they include a definition for a complex
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/galaxy_install/install_manager.py
--- a/lib/tool_shed/galaxy_install/install_manager.py
+++ b/lib/tool_shed/galaxy_install/install_manager.py
@@ -22,17 +22,17 @@
from tool_shed.util import datatype_util
from tool_shed.util import encoding_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import xml_util
+from tool_shed.galaxy_install.metadata.installed_repository_metadata_manager import InstalledRepositoryMetadataManager
+from tool_shed.galaxy_install.repository_dependencies import repository_dependency_manager
from tool_shed.galaxy_install.tool_dependencies.recipe.env_file_builder import EnvFileBuilder
from tool_shed.galaxy_install.tool_dependencies.recipe.install_environment import InstallEnvironment
from tool_shed.galaxy_install.tool_dependencies.recipe.recipe_manager import StepManager
from tool_shed.galaxy_install.tool_dependencies.recipe.recipe_manager import TagManager
-from tool_shed.galaxy_install.repository_dependencies import repository_dependency_manager
log = logging.getLogger( __name__ )
@@ -499,17 +499,17 @@
"""
install_model = self.app.install_model
shed_config_dict = self.app.toolbox.get_shed_config_dict_by_filename( shed_tool_conf )
+ irmm = InstalledRepositoryMetadataManager( self.app )
metadata_dict, invalid_file_tups = \
- metadata_util.generate_metadata_for_changeset_revision( app=self.app,
- repository=tool_shed_repository,
- changeset_revision=tool_shed_repository.changeset_revision,
- repository_clone_url=repository_clone_url,
- shed_config_dict=shed_config_dict,
- relative_install_dir=relative_install_dir,
- repository_files_dir=None,
- resetting_all_metadata_on_repository=False,
- updating_installed_repository=False,
- persist=True )
+ irmm.generate_metadata_for_changeset_revision( repository=tool_shed_repository,
+ changeset_revision=tool_shed_repository.changeset_revision,
+ repository_clone_url=repository_clone_url,
+ shed_config_dict=shed_config_dict,
+ relative_install_dir=relative_install_dir,
+ repository_files_dir=None,
+ resetting_all_metadata_on_repository=False,
+ updating_installed_repository=False,
+ persist=True )
tool_shed_repository.metadata = metadata_dict
# Update the tool_shed_repository.tool_shed_status column in the database.
tool_shed_status_dict = suc.get_tool_shed_status_for_installed_repository( self.app, tool_shed_repository )
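[editor's note] The call-site change above replaces the module-level `metadata_util.generate_metadata_for_changeset_revision( app=self.app, ... )` function with a method on an `InstalledRepositoryMetadataManager` instance that is constructed once with `app`. A minimal sketch of that refactoring pattern, with purely illustrative names (not Galaxy's actual API):

```python
# Old style: a free function that must be handed `app` on every call.
def generate_metadata(app, repository, persist=False):
    return {'name': repository, 'shed': app['shed'], 'persist': persist}


class MetadataManager(object):
    # New style: `app` is bound once at construction time, so callers
    # pass only the arguments that vary per call.
    def __init__(self, app):
        self.app = app

    def generate_metadata(self, repository, persist=False):
        return {'name': repository, 'shed': self.app['shed'], 'persist': persist}


app = {'shed': 'toolshed.g2.bx.psu.edu'}
old = generate_metadata(app, 'package_readline_6_2', persist=True)
mgr = MetadataManager(app)
new = mgr.generate_metadata('package_readline_6_2', persist=True)
assert old == new  # behavior is unchanged; only the call site shrinks
```

The manager form also gives later refactorings (like this changeset's removal of `trans`) a single place to hang shared state.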
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/galaxy_install/metadata/installed_repository_metadata_manager.py
--- /dev/null
+++ b/lib/tool_shed/galaxy_install/metadata/installed_repository_metadata_manager.py
@@ -0,0 +1,92 @@
+import logging
+import os
+
+from galaxy import util
+from galaxy.util import inflector
+
+from tool_shed.metadata import metadata_generator
+
+from tool_shed.util import common_util
+from tool_shed.util import shed_util_common as suc
+from tool_shed.util import tool_util
+
+log = logging.getLogger( __name__ )
+
+
+class InstalledRepositoryMetadataManager( metadata_generator.MetadataGenerator ):
+
+ def __init__( self, app ):
+ super( InstalledRepositoryMetadataManager, self ).__init__( app )
+ self.app = app
+
+ def reset_all_metadata_on_installed_repository( self, id ):
+ """Reset all metadata on a single tool shed repository installed into a Galaxy instance."""
+ invalid_file_tups = []
+ metadata_dict = {}
+ repository = suc.get_installed_tool_shed_repository( self.app, id )
+ repository_clone_url = common_util.generate_clone_url_for_installed_repository( self.app, repository )
+ tool_path, relative_install_dir = repository.get_tool_relative_path( self.app )
+ if relative_install_dir:
+ original_metadata_dict = repository.metadata
+ metadata_dict, invalid_file_tups = \
+ self.generate_metadata_for_changeset_revision( repository=repository,
+ changeset_revision=repository.changeset_revision,
+ repository_clone_url=repository_clone_url,
+                                                               shed_config_dict=repository.get_shed_config_dict( self.app ),
+ relative_install_dir=relative_install_dir,
+ repository_files_dir=None,
+ resetting_all_metadata_on_repository=False,
+ updating_installed_repository=False,
+ persist=False )
+ repository.metadata = metadata_dict
+ if metadata_dict != original_metadata_dict:
+ suc.update_in_shed_tool_config( self.app, repository )
+ self.app.install_model.context.add( repository )
+ self.app.install_model.context.flush()
+ log.debug( 'Metadata has been reset on repository %s.' % repository.name )
+ else:
+ log.debug( 'Metadata did not need to be reset on repository %s.' % repository.name )
+ else:
+ log.debug( 'Error locating installation directory for repository %s.' % repository.name )
+ return invalid_file_tups, metadata_dict
+
+ def reset_metadata_on_selected_repositories( self, user, **kwd ):
+ """
+ Inspect the repository changelog to reset metadata for all appropriate changeset revisions.
+ This method is called from both Galaxy and the Tool Shed.
+ """
+ repository_ids = util.listify( kwd.get( 'repository_ids', None ) )
+ message = ''
+ status = 'done'
+ if repository_ids:
+ successful_count = 0
+ unsuccessful_count = 0
+ for repository_id in repository_ids:
+ try:
+ repository = suc.get_installed_tool_shed_repository( self.app, repository_id )
+ owner = str( repository.owner )
+ invalid_file_tups, metadata_dict = \
+ self.reset_all_metadata_on_installed_repository( repository_id )
+ if invalid_file_tups:
+ message = tool_util.generate_message_for_invalid_tools( self.app,
+ invalid_file_tups,
+ repository,
+ None,
+ as_html=False )
+ log.debug( message )
+ unsuccessful_count += 1
+ else:
+ log.debug( "Successfully reset metadata on repository %s owned by %s" % ( str( repository.name ), owner ) )
+ successful_count += 1
+ except:
+                    log.exception( "Error attempting to reset metadata on repository %s", str( repository_id ) )
+ unsuccessful_count += 1
+ message = "Successfully reset metadata on %d %s. " % \
+ ( successful_count, inflector.cond_plural( successful_count, "repository" ) )
+ if unsuccessful_count:
+ message += "Error setting metadata on %d %s - see the paster log for details. " % \
+ ( unsuccessful_count, inflector.cond_plural( unsuccessful_count, "repository" ) )
+ else:
+            message = 'Select at least one repository on which to reset all metadata.'
+ status = 'error'
+ return message, status
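[editor's note] `reset_metadata_on_selected_repositories` above tallies successes and failures per repository, then assembles a status message using `inflector.cond_plural`. A standalone sketch of that counting-and-pluralization pattern, with a toy `cond_plural` standing in for `galaxy.util.inflector` (which knows irregular plurals such as "repository"/"repositories"; here the plural form is supplied explicitly):

```python
def cond_plural(count, word, plural=None):
    # Toy stand-in for galaxy.util.inflector.cond_plural: use the
    # singular form only when the count is exactly 1.
    if count == 1:
        return word
    return plural if plural is not None else word + 's'


def build_reset_message(successful_count, unsuccessful_count):
    # Mirrors the message assembly in reset_metadata_on_selected_repositories.
    message = "Successfully reset metadata on %d %s. " % \
        (successful_count, cond_plural(successful_count, "repository", "repositories"))
    if unsuccessful_count:
        message += "Error setting metadata on %d %s - see the paster log for details. " % \
            (unsuccessful_count, cond_plural(unsuccessful_count, "repository", "repositories"))
    return message


print(build_reset_message(1, 0))   # singular form
print(build_reset_message(3, 2))   # plural forms, with the error suffix
```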
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/galaxy_install/tool_migration_manager.py
--- a/lib/tool_shed/galaxy_install/tool_migration_manager.py
+++ b/lib/tool_shed/galaxy_install/tool_migration_manager.py
@@ -8,15 +8,18 @@
import tempfile
import threading
import logging
+
from galaxy import util
from galaxy.tools import ToolSection
+
from tool_shed.galaxy_install import install_manager
-import tool_shed.util.shed_util_common as suc
+from tool_shed.galaxy_install.metadata.installed_repository_metadata_manager import InstalledRepositoryMetadataManager
+
from tool_shed.util import basic_util
from tool_shed.util import common_util
from tool_shed.util import datatype_util
from tool_shed.util import hg_util
-from tool_shed.util import metadata_util
+from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import xml_util
@@ -381,17 +384,17 @@
log.exception( "Exception attempting to filter and persist non-shed-related tool panel configs:\n%s" % str( e ) )
finally:
lock.release()
+ irmm = InstalledRepositoryMetadataManager( self.app )
metadata_dict, invalid_file_tups = \
- metadata_util.generate_metadata_for_changeset_revision( app=self.app,
- repository=tool_shed_repository,
- changeset_revision=tool_shed_repository.changeset_revision,
- repository_clone_url=repository_clone_url,
- shed_config_dict = self.shed_config_dict,
- relative_install_dir=relative_install_dir,
- repository_files_dir=None,
- resetting_all_metadata_on_repository=False,
- updating_installed_repository=False,
- persist=True )
+ irmm.generate_metadata_for_changeset_revision( repository=tool_shed_repository,
+ changeset_revision=tool_shed_repository.changeset_revision,
+ repository_clone_url=repository_clone_url,
+                                                           shed_config_dict=self.shed_config_dict,
+ relative_install_dir=relative_install_dir,
+ repository_files_dir=None,
+ resetting_all_metadata_on_repository=False,
+ updating_installed_repository=False,
+ persist=True )
tool_shed_repository.metadata = metadata_dict
self.app.install_model.context.add( tool_shed_repository )
self.app.install_model.context.flush()
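[editor's note] The new `MetadataGenerator` class introduced in `metadata_generator.py` below builds GUIDs for data managers (and, analogously, tools) by stripping the protocol and username from the repository clone URL and joining the type, id, and version (`generate_guid_for_object`). A rough sketch of that scheme; `remove_protocol_and_user_from_clone_url` is approximated here, since the real helper lives in `tool_shed.util.common_util`:

```python
def remove_protocol_and_user_from_clone_url(url):
    # Approximation of common_util.remove_protocol_and_user_from_clone_url:
    # 'http://user@toolshed/repos/owner/name' -> 'toolshed/repos/owner/name'
    if '://' in url:
        url = url.split('://', 1)[1]
    if '@' in url:
        url = url.split('@', 1)[1]
    return url.rstrip('/')


def generate_guid_for_object(repository_clone_url, guid_type, obj_id, version):
    # Join the bare clone URL with the object's type, id, and version.
    tmp_url = remove_protocol_and_user_from_clone_url(repository_clone_url)
    return '%s/%s/%s/%s' % (tmp_url, guid_type, obj_id, version)


guid = generate_guid_for_object('http://u@toolshed.g2.bx.psu.edu/repos/test/dm',
                                'data_manager', 'fetch_genome', '1.0')
# guid -> 'toolshed.g2.bx.psu.edu/repos/test/dm/data_manager/fetch_genome/1.0'
```

Because the GUID embeds the tool shed host and repository path, the same data manager id installed from two different sheds never collides.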
diff -r 5af80141674f5ac3166b6782385c408258606d0a -r 4d2e744f8efc73dc265c3d3bf33c4c2886888b7a lib/tool_shed/metadata/metadata_generator.py
--- /dev/null
+++ b/lib/tool_shed/metadata/metadata_generator.py
@@ -0,0 +1,1038 @@
+import json
+import logging
+import os
+import tempfile
+
+from galaxy import util
+from galaxy.datatypes import checkers
+from galaxy.model.orm import and_
+from galaxy.tools.data_manager.manager import DataManager
+from galaxy.web import url_for
+
+from tool_shed.repository_types import util as rt_util
+
+from tool_shed.util import basic_util
+from tool_shed.util import common_util
+from tool_shed.util import hg_util
+from tool_shed.util import readme_util
+from tool_shed.util import shed_util_common as suc
+from tool_shed.util import tool_dependency_util
+from tool_shed.util import tool_util
+from tool_shed.util import xml_util
+
+log = logging.getLogger( __name__ )
+
+
+class MetadataGenerator( object ):
+
+ def __init__( self, app ):
+ self.app = app
+ self.sa_session = app.model.context.current
+ self.NOT_TOOL_CONFIGS = [ suc.DATATYPES_CONFIG_FILENAME,
+ rt_util.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME,
+ rt_util.TOOL_DEPENDENCY_DEFINITION_FILENAME,
+ suc.REPOSITORY_DATA_MANAGER_CONFIG_FILENAME ]
+
+ def generate_data_manager_metadata( self, repository, repo_dir, data_manager_config_filename, metadata_dict,
+ shed_config_dict=None ):
+ """
+ Update the received metadata_dict with information from the parsed data_manager_config_filename.
+ """
+ if data_manager_config_filename is None:
+ return metadata_dict
+ repo_path = repository.repo_path( self.app )
+ try:
+ # Galaxy Side.
+ repo_files_directory = repository.repo_files_directory( self.app )
+ repo_dir = repo_files_directory
+ repository_clone_url = common_util.generate_clone_url_for_installed_repository( self.app, repository )
+ except AttributeError:
+ # Tool Shed side.
+ repo_files_directory = repo_path
+ repository_clone_url = common_util.generate_clone_url_for_repository_in_tool_shed( None, repository )
+ relative_data_manager_dir = util.relpath( os.path.split( data_manager_config_filename )[0], repo_dir )
+ rel_data_manager_config_filename = os.path.join( relative_data_manager_dir,
+ os.path.split( data_manager_config_filename )[1] )
+ data_managers = {}
+ invalid_data_managers = []
+ data_manager_metadata = { 'config_filename': rel_data_manager_config_filename,
+ 'data_managers': data_managers,
+ 'invalid_data_managers': invalid_data_managers,
+ 'error_messages': [] }
+ metadata_dict[ 'data_manager' ] = data_manager_metadata
+ tree, error_message = xml_util.parse_xml( data_manager_config_filename )
+ if tree is None:
+ # We are not able to load any data managers.
+ data_manager_metadata[ 'error_messages' ].append( error_message )
+ return metadata_dict
+ tool_path = None
+ if shed_config_dict:
+ tool_path = shed_config_dict.get( 'tool_path', None )
+ tools = {}
+ for tool in metadata_dict.get( 'tools', [] ):
+ tool_conf_name = tool[ 'tool_config' ]
+ if tool_path:
+ tool_conf_name = os.path.join( tool_path, tool_conf_name )
+ tools[ tool_conf_name ] = tool
+ root = tree.getroot()
+ data_manager_tool_path = root.get( 'tool_path', None )
+ if data_manager_tool_path:
+ relative_data_manager_dir = os.path.join( relative_data_manager_dir, data_manager_tool_path )
+ for i, data_manager_elem in enumerate( root.findall( 'data_manager' ) ):
+ tool_file = data_manager_elem.get( 'tool_file', None )
+ data_manager_id = data_manager_elem.get( 'id', None )
+ if data_manager_id is None:
+ log.error( 'Data Manager entry is missing id attribute in "%s".' % ( data_manager_config_filename ) )
+ invalid_data_managers.append( { 'index': i,
+ 'error_message': 'Data Manager entry is missing id attribute' } )
+ continue
+ # FIXME: default behavior is to fall back to tool.name.
+ data_manager_name = data_manager_elem.get( 'name', data_manager_id )
+ version = data_manager_elem.get( 'version', DataManager.DEFAULT_VERSION )
+ guid = self.generate_guid_for_object( repository_clone_url, DataManager.GUID_TYPE, data_manager_id, version )
+ data_tables = []
+ if tool_file is None:
+ log.error( 'Data Manager entry is missing tool_file attribute in "%s".' % ( data_manager_config_filename ) )
+ invalid_data_managers.append( { 'index': i,
+ 'error_message': 'Data Manager entry is missing tool_file attribute' } )
+ continue
+ else:
+ bad_data_table = False
+ for data_table_elem in data_manager_elem.findall( 'data_table' ):
+ data_table_name = data_table_elem.get( 'name', None )
+ if data_table_name is None:
+ log.error( 'Data Manager data_table entry is missing name attribute in "%s".' % ( data_manager_config_filename ) )
+ invalid_data_managers.append( { 'index': i,
+ 'error_message': 'Data Manager entry is missing name attribute' } )
+ bad_data_table = True
+ break
+ else:
+ data_tables.append( data_table_name )
+ if bad_data_table:
+ continue
+ data_manager_metadata_tool_file = os.path.normpath( os.path.join( relative_data_manager_dir, tool_file ) )
+ tool_metadata_tool_file = os.path.join( repo_files_directory, data_manager_metadata_tool_file )
+ tool = tools.get( tool_metadata_tool_file, None )
+ if tool is None:
+ log.error( "Unable to determine tools metadata for '%s'." % ( data_manager_metadata_tool_file ) )
+ invalid_data_managers.append( { 'index': i,
+ 'error_message': 'Unable to determine tools metadata' } )
+ continue
+ data_managers[ data_manager_id ] = { 'id': data_manager_id,
+ 'name': data_manager_name,
+ 'guid': guid,
+ 'version': version,
+ 'tool_config_file': data_manager_metadata_tool_file,
+ 'data_tables': data_tables,
+ 'tool_guid': tool[ 'guid' ] }
+ log.debug( 'Loaded Data Manager tool_files: %s' % ( tool_file ) )
+ return metadata_dict
+
+ def generate_datatypes_metadata( self, repository, repository_clone_url, repository_files_dir, datatypes_config,
+ metadata_dict ):
+ """Update the received metadata_dict with information from the parsed datatypes_config."""
+ tree, error_message = xml_util.parse_xml( datatypes_config )
+ if tree is None:
+ return metadata_dict
+ root = tree.getroot()
+ repository_datatype_code_files = []
+ datatype_files = root.find( 'datatype_files' )
+ if datatype_files is not None:
+ for elem in datatype_files.findall( 'datatype_file' ):
+ name = elem.get( 'name', None )
+ repository_datatype_code_files.append( name )
+ metadata_dict[ 'datatype_files' ] = repository_datatype_code_files
+ datatypes = []
+ registration = root.find( 'registration' )
+ if registration is not None:
+ for elem in registration.findall( 'datatype' ):
+ converters = []
+ display_app_containers = []
+ datatypes_dict = {}
+ # Handle defined datatype attributes.
+ display_in_upload = elem.get( 'display_in_upload', None )
+ if display_in_upload:
+ datatypes_dict[ 'display_in_upload' ] = display_in_upload
+ dtype = elem.get( 'type', None )
+ if dtype:
+ datatypes_dict[ 'dtype' ] = dtype
+ extension = elem.get( 'extension', None )
+ if extension:
+ datatypes_dict[ 'extension' ] = extension
+ max_optional_metadata_filesize = elem.get( 'max_optional_metadata_filesize', None )
+ if max_optional_metadata_filesize:
+ datatypes_dict[ 'max_optional_metadata_filesize' ] = max_optional_metadata_filesize
+ mimetype = elem.get( 'mimetype', None )
+ if mimetype:
+ datatypes_dict[ 'mimetype' ] = mimetype
+ subclass = elem.get( 'subclass', None )
+ if subclass:
+ datatypes_dict[ 'subclass' ] = subclass
+ # Handle defined datatype converters and display applications.
+ for sub_elem in elem:
+ if sub_elem.tag == 'converter':
+ # <converter file="bed_to_gff_converter.xml" target_datatype="gff"/>
+ tool_config = sub_elem.attrib[ 'file' ]
+ target_datatype = sub_elem.attrib[ 'target_datatype' ]
+ # Parse the tool_config to get the guid.
+ tool_config_path = hg_util.get_config_from_disk( tool_config, repository_files_dir )
+ full_path = os.path.abspath( tool_config_path )
+ tool, valid, error_message = \
+ tool_util.load_tool_from_config( self.app, self.app.security.encode_id( repository.id ), full_path )
+ if tool is None:
+ guid = None
+ else:
+ guid = suc.generate_tool_guid( repository_clone_url, tool )
+ converter_dict = dict( tool_config=tool_config,
+ guid=guid,
+ target_datatype=target_datatype )
+ converters.append( converter_dict )
+ elif sub_elem.tag == 'display':
+ # <display file="ucsc/bigwig.xml" />
+ # Should we store more than this?
+ display_file = sub_elem.attrib[ 'file' ]
+ display_app_dict = dict( display_file=display_file )
+ display_app_containers.append( display_app_dict )
+ if converters:
+ datatypes_dict[ 'converters' ] = converters
+ if display_app_containers:
+ datatypes_dict[ 'display_app_containers' ] = display_app_containers
+ if datatypes_dict:
+ datatypes.append( datatypes_dict )
+ if datatypes:
+ metadata_dict[ 'datatypes' ] = datatypes
+ return metadata_dict
+
+ def generate_environment_dependency_metadata( self, elem, valid_tool_dependencies_dict ):
+ """
+ The value of env_var_name must match the value of the "set_environment" type
+ in the tool config's <requirements> tag set, or the tool dependency will be
+ considered an orphan.
+ """
+ # The value of the received elem looks something like this:
+ # <set_environment version="1.0">
+ # <environment_variable name="JAVA_JAR_PATH" action="set_to">$INSTALL_DIR</environment_variable>
+ # </set_environment>
+ for env_elem in elem:
+ # <environment_variable name="JAVA_JAR_PATH" action="set_to">$INSTALL_DIR</environment_variable>
+ env_name = env_elem.get( 'name', None )
+ if env_name:
+ requirements_dict = dict( name=env_name, type='set_environment' )
+ if 'set_environment' in valid_tool_dependencies_dict:
+ valid_tool_dependencies_dict[ 'set_environment' ].append( requirements_dict )
+ else:
+ valid_tool_dependencies_dict[ 'set_environment' ] = [ requirements_dict ]
+ return valid_tool_dependencies_dict
+
+ def generate_guid_for_object( self, repository_clone_url, guid_type, obj_id, version ):
+ tmp_url = common_util.remove_protocol_and_user_from_clone_url( repository_clone_url )
+ return '%s/%s/%s/%s' % ( tmp_url, guid_type, obj_id, version )
+
+ def generate_metadata_for_changeset_revision( self, repository, changeset_revision, repository_clone_url,
+ shed_config_dict=None, relative_install_dir=None, repository_files_dir=None,
+ resetting_all_metadata_on_repository=False, updating_installed_repository=False,
+ persist=False ):
+ """
+ Generate metadata for a repository using its files on disk. To generate metadata
+ for changeset revisions older than the repository tip, the repository will have been
+ cloned to a temporary location and updated to a specified changeset revision to access
+ that changeset revision's disk files, so the value of repository_files_dir will not
+ always be repository.repo_path( self.app ) (it could be an absolute path to a temporary
+ directory containing a clone). If it is an absolute path, the value of relative_install_dir
+ must contain repository.repo_path( self.app ).
+
+ The value of persist will be True when the installed repository contains a valid
+ tool_data_table_conf.xml.sample file, in which case the entries should ultimately be
+ persisted to the file referred to by self.app.config.shed_tool_data_table_config.
+ """
+ if shed_config_dict is None:
+ shed_config_dict = {}
+ if updating_installed_repository:
+ # Keep the original tool shed repository metadata if setting metadata on a repository
+ # installed into a local Galaxy instance for which we have pulled updates.
+ original_repository_metadata = repository.metadata
+ else:
+ original_repository_metadata = None
+ readme_file_names = readme_util.get_readme_file_names( str( repository.name ) )
+ if self.app.name == 'galaxy':
+ # Shed related tool panel configs are only relevant to Galaxy.
+ metadata_dict = { 'shed_config_filename' : shed_config_dict.get( 'config_filename' ) }
+ else:
+ metadata_dict = {}
+ readme_files = []
+ invalid_file_tups = []
+ invalid_tool_configs = []
+ tool_dependencies_config = None
+ original_tool_data_path = self.app.config.tool_data_path
+ original_tool_data_table_config_path = self.app.config.tool_data_table_config_path
+ if resetting_all_metadata_on_repository:
+ if not relative_install_dir:
+ raise Exception( "The value of repository.repo_path must be sent when resetting all metadata on a repository." )
+ # Keep track of the location where the repository is temporarily cloned so that we can
+ # strip the path when setting metadata. The value of repository_files_dir is the full
+ # path to the temporary directory to which the repository was cloned.
+ work_dir = repository_files_dir
+ files_dir = repository_files_dir
+ # Since we're working from a temporary directory, we can safely copy sample files included
+ # in the repository to the repository root.
+ self.app.config.tool_data_path = repository_files_dir
+ self.app.config.tool_data_table_config_path = repository_files_dir
+ else:
+ # Use a temporary working directory to copy all sample files.
+ work_dir = tempfile.mkdtemp( prefix="tmp-toolshed-gmfcr" )
+ # All other files are on disk in the repository's repo_path, which is the value of
+ # relative_install_dir.
+ files_dir = relative_install_dir
+ if shed_config_dict.get( 'tool_path' ):
+ files_dir = os.path.join( shed_config_dict[ 'tool_path' ], files_dir )
+ self.app.config.tool_data_path = work_dir #FIXME: Thread safe?
+ self.app.config.tool_data_table_config_path = work_dir
+ # Handle proprietary datatypes, if any.
+ datatypes_config = hg_util.get_config_from_disk( suc.DATATYPES_CONFIG_FILENAME, files_dir )
+ if datatypes_config:
+ metadata_dict = self.generate_datatypes_metadata( repository,
+ repository_clone_url,
+ files_dir,
+ datatypes_config,
+ metadata_dict )
+ # Get the relative path to all sample files included in the repository for storage in
+ # the repository's metadata.
+ sample_file_metadata_paths, sample_file_copy_paths = \
+ self.get_sample_files_from_disk( repository_files_dir=files_dir,
+ tool_path=shed_config_dict.get( 'tool_path' ),
+ relative_install_dir=relative_install_dir,
+ resetting_all_metadata_on_repository=resetting_all_metadata_on_repository )
+ if sample_file_metadata_paths:
+ metadata_dict[ 'sample_files' ] = sample_file_metadata_paths
+ # Copy all sample files included in the repository to a single directory location so we
+ # can load tools that depend on them.
+ for sample_file in sample_file_copy_paths:
+ tool_util.copy_sample_file( self.app, sample_file, dest_path=work_dir )
+ # If the list of sample files includes a tool_data_table_conf.xml.sample file, load
+ # its table elements into memory.
+ relative_path, filename = os.path.split( sample_file )
+ if filename == 'tool_data_table_conf.xml.sample':
+ new_table_elems, error_message = \
+ self.app.tool_data_tables.add_new_entries_from_config_file( config_filename=sample_file,
+ tool_data_path=self.app.config.tool_data_path,
+ shed_tool_data_table_config=self.app.config.shed_tool_data_table_config,
+ persist=False )
+ if error_message:
+ invalid_file_tups.append( ( filename, error_message ) )
+ for root, dirs, files in os.walk( files_dir ):
+ if root.find( '.hg' ) < 0 and root.find( 'hgrc' ) < 0:
+ if '.hg' in dirs:
+ dirs.remove( '.hg' )
+ for name in files:
+ # See if we have a repository dependencies defined.
+ if name == rt_util.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME:
+ path_to_repository_dependencies_config = os.path.join( root, name )
+ metadata_dict, error_message = \
+ self.generate_repository_dependency_metadata( path_to_repository_dependencies_config,
+ metadata_dict,
+ updating_installed_repository=updating_installed_repository )
+ if error_message:
+ invalid_file_tups.append( ( name, error_message ) )
+ # See if we have one or more READ_ME files.
+ elif name.lower() in readme_file_names:
+ relative_path_to_readme = self.get_relative_path_to_repository_file( root,
+ name,
+ relative_install_dir,
+ work_dir,
+ shed_config_dict,
+ resetting_all_metadata_on_repository )
+ readme_files.append( relative_path_to_readme )
+ # See if we have a tool config.
+ elif name not in self.NOT_TOOL_CONFIGS and name.endswith( '.xml' ):
+ full_path = str( os.path.abspath( os.path.join( root, name ) ) )
+ if os.path.getsize( full_path ) > 0:
+ if not ( checkers.check_binary( full_path ) or
+ checkers.check_image( full_path ) or
+ checkers.check_gzip( full_path )[ 0 ] or
+ checkers.check_bz2( full_path )[ 0 ] or
+ checkers.check_zip( full_path ) ):
+ # Make sure we're looking at a tool config and not a display application
+ # config or something else.
+ element_tree, error_message = xml_util.parse_xml( full_path )
+ if element_tree is None:
+ is_tool = False
+ else:
+ element_tree_root = element_tree.getroot()
+ is_tool = element_tree_root.tag == 'tool'
+ if is_tool:
+ tool, valid, error_message = \
+ tool_util.load_tool_from_config( self.app,
+ self.app.security.encode_id( repository.id ),
+ full_path )
+ if tool is None:
+ if not valid:
+ invalid_tool_configs.append( name )
+ invalid_file_tups.append( ( name, error_message ) )
+ else:
+ invalid_files_and_errors_tups = \
+ tool_util.check_tool_input_params( self.app,
+ files_dir,
+ name,
+ tool,
+ sample_file_copy_paths )
+ can_set_metadata = True
+ for tup in invalid_files_and_errors_tups:
+ if name in tup:
+ can_set_metadata = False
+ invalid_tool_configs.append( name )
+ break
+ if can_set_metadata:
+ relative_path_to_tool_config = \
+ self.get_relative_path_to_repository_file( root,
+ name,
+ relative_install_dir,
+ work_dir,
+ shed_config_dict,
+ resetting_all_metadata_on_repository )
+ metadata_dict = self.generate_tool_metadata( relative_path_to_tool_config,
+ tool,
+ repository_clone_url,
+ metadata_dict )
+ else:
+ for tup in invalid_files_and_errors_tups:
+ invalid_file_tups.append( tup )
+ # Find all exported workflows.
+ elif name.endswith( '.ga' ):
+ relative_path = os.path.join( root, name )
+ if os.path.getsize( os.path.abspath( relative_path ) ) > 0:
+ fp = open( relative_path, 'rb' )
+ workflow_text = fp.read()
+ fp.close()
+ if workflow_text:
+ valid_exported_galaxy_workflow = True
+ try:
+ exported_workflow_dict = json.loads( workflow_text )
+ except Exception, e:
+                                log.exception( "Skipping file %s since it does not seem to be a valid exported Galaxy workflow: %s" % \
+                                    ( str( relative_path ), str( e ) ) )
+ valid_exported_galaxy_workflow = False
+ if valid_exported_galaxy_workflow and \
+ 'a_galaxy_workflow' in exported_workflow_dict and \
+ exported_workflow_dict[ 'a_galaxy_workflow' ] == 'true':
+ metadata_dict = self.generate_workflow_metadata( relative_path,
+ exported_workflow_dict,
+ metadata_dict )
+ # Handle any data manager entries
+ data_manager_config = hg_util.get_config_from_disk( suc.REPOSITORY_DATA_MANAGER_CONFIG_FILENAME, files_dir )
+ metadata_dict = self.generate_data_manager_metadata( repository,
+ files_dir,
+ data_manager_config,
+ metadata_dict,
+ shed_config_dict=shed_config_dict )
+
+ if readme_files:
+ metadata_dict[ 'readme_files' ] = readme_files
+ # This step must be done after metadata for tools has been defined.
+ tool_dependencies_config = hg_util.get_config_from_disk( rt_util.TOOL_DEPENDENCY_DEFINITION_FILENAME, files_dir )
+ if tool_dependencies_config:
+ metadata_dict, error_message = \
+ self.generate_tool_dependency_metadata( repository,
+ changeset_revision,
+ repository_clone_url,
+ tool_dependencies_config,
+ metadata_dict,
+ original_repository_metadata=original_repository_metadata )
+ if error_message:
+ invalid_file_tups.append( ( rt_util.TOOL_DEPENDENCY_DEFINITION_FILENAME, error_message ) )
+ if invalid_tool_configs:
+            metadata_dict[ 'invalid_tools' ] = invalid_tool_configs
+ # Reset the value of the app's tool_data_path and tool_data_table_config_path to their respective original values.
+ self.app.config.tool_data_path = original_tool_data_path
+ self.app.config.tool_data_table_config_path = original_tool_data_table_config_path
+ basic_util.remove_dir( work_dir )
+ return metadata_dict, invalid_file_tups
+
+ def generate_package_dependency_metadata( self, elem, valid_tool_dependencies_dict, invalid_tool_dependencies_dict ):
+ """
+ Generate the metadata for a tool dependencies package defined for a repository. The
+ value of package_name must match the value of the "package" type in the tool config's
+ <requirements> tag set.
+ """
+ # TODO: make this function a class.
+ repository_dependency_is_valid = True
+ repository_dependency_tup = []
+ requirements_dict = {}
+ error_message = ''
+ package_name = elem.get( 'name', None )
+ package_version = elem.get( 'version', None )
+ if package_name and package_version:
+ requirements_dict[ 'name' ] = package_name
+ requirements_dict[ 'version' ] = package_version
+ requirements_dict[ 'type' ] = 'package'
+ for sub_elem in elem:
+ if sub_elem.tag == 'readme':
+ requirements_dict[ 'readme' ] = sub_elem.text
+ elif sub_elem.tag == 'repository':
+ # We have a complex repository dependency. If the returned value of repository_dependency_is_valid
+                    # is False, the tool dependency definition will be set as invalid. This is currently the only case
+ # where a tool dependency definition is considered invalid.
+ repository_dependency_tup, repository_dependency_is_valid, error_message = \
+ self.handle_repository_elem( repository_elem=sub_elem,
+ only_if_compiling_contained_td=False,
+ updating_installed_repository=False )
+ elif sub_elem.tag == 'install':
+ package_install_version = sub_elem.get( 'version', '1.0' )
+ if package_install_version == '1.0':
+ # Complex repository dependencies can be defined within the last <actions> tag set contained in an
+ # <actions_group> tag set. Comments, <repository> tag sets and <readme> tag sets will be skipped
+ # in tool_dependency_util.parse_package_elem().
+ actions_elem_tuples = tool_dependency_util.parse_package_elem( sub_elem,
+ platform_info_dict=None,
+ include_after_install_actions=False )
+ if actions_elem_tuples:
+ # We now have a list of a single tuple that looks something like:
+ # [(True, <Element 'actions' at 0x104017850>)]
+ actions_elem_tuple = actions_elem_tuples[ 0 ]
+ in_actions_group, actions_elem = actions_elem_tuple
+ if in_actions_group:
+ # Since we're inside an <actions_group> tag set, inspect the actions_elem to see if a complex
+ # repository dependency is defined. By definition, complex repository dependency definitions
+ # contained within the last <actions> tag set within an <actions_group> tag set will have the
+                        # value of "only_if_compiling_contained_td" set to True in the call to handle_repository_elem().
+ for action_elem in actions_elem:
+ if action_elem.tag == 'package':
+ # <package name="libgtextutils" version="0.6">
+ # <repository name="package_libgtextutils_0_6" owner="test" prior_installation_required="True" />
+ # </package>
+ ae_package_name = action_elem.get( 'name', None )
+ ae_package_version = action_elem.get( 'version', None )
+ if ae_package_name and ae_package_version:
+ for sub_action_elem in action_elem:
+ if sub_action_elem.tag == 'repository':
+ # We have a complex repository dependency.
+ repository_dependency_tup, repository_dependency_is_valid, error_message = \
+ self.handle_repository_elem( repository_elem=sub_action_elem,
+ only_if_compiling_contained_td=True,
+ updating_installed_repository=False )
+ elif action_elem.tag == 'action':
+ # <action type="set_environment_for_install">
+ # <repository changeset_revision="b107b91b3574" name="package_readline_6_2" owner="devteam" prior_installation_required="True" toolshed="http://localhost:9009">
+ # <package name="readline" version="6.2" />
+ # </repository>
+ # </action>
+ for sub_action_elem in action_elem:
+ if sub_action_elem.tag == 'repository':
+ # We have a complex repository dependency.
+ repository_dependency_tup, repository_dependency_is_valid, error_message = \
+ self.handle_repository_elem( repository_elem=sub_action_elem,
+ only_if_compiling_contained_td=True,
+ updating_installed_repository=False )
+ if requirements_dict:
+ dependency_key = '%s/%s' % ( package_name, package_version )
+ if repository_dependency_is_valid:
+ valid_tool_dependencies_dict[ dependency_key ] = requirements_dict
+ else:
+ # Append the error message to the requirements_dict.
+ requirements_dict[ 'error' ] = error_message
+ invalid_tool_dependencies_dict[ dependency_key ] = requirements_dict
+ return valid_tool_dependencies_dict, \
+ invalid_tool_dependencies_dict, \
+ repository_dependency_tup, \
+ repository_dependency_is_valid, \
+ error_message
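The complex-repository-dependency detection above boils down to finding <repository> children nested under a <package> action. A minimal, self-contained sketch of that parse, using the <package>/<repository> example quoted in the comments (the helper name below is hypothetical, not part of the Tool Shed API):

```python
import xml.etree.ElementTree as ET

# A minimal tool_dependencies.xml fragment containing a complex repository
# dependency, modeled on the example in the comments above.
FRAGMENT = """
<package name="libgtextutils" version="0.6">
    <repository name="package_libgtextutils_0_6" owner="test"
                prior_installation_required="True" />
</package>
"""

def find_complex_repository_dependencies(package_elem):
    """Return (name, owner) tuples for each <repository> child of a <package> elem."""
    deps = []
    for sub_elem in package_elem:
        if sub_elem.tag == 'repository':
            deps.append((sub_elem.get('name'), sub_elem.get('owner')))
    return deps

package_elem = ET.fromstring(FRAGMENT)
deps = find_complex_repository_dependencies(package_elem)
```

The real method additionally threads the result through handle_repository_elem() to validate the referenced repository; this sketch only shows the XML traversal.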
+
+ def generate_repository_dependency_metadata( self, repository_dependencies_config, metadata_dict,
+ updating_installed_repository=False ):
+ """
+ Generate a repository dependencies dictionary based on valid information defined in the received
+ repository_dependencies_config. This method is called from the tool shed as well as from Galaxy.
+ """
+ error_message = ''
+ # Make sure we're looking at a valid repository_dependencies.xml file.
+ tree, error_message = xml_util.parse_xml( repository_dependencies_config )
+ if tree is None:
+ xml_is_valid = False
+ else:
+ root = tree.getroot()
+ xml_is_valid = root.tag == 'repositories'
+ if xml_is_valid:
+ invalid_repository_dependencies_dict = dict( description=root.get( 'description' ) )
+ invalid_repository_dependency_tups = []
+ valid_repository_dependencies_dict = dict( description=root.get( 'description' ) )
+ valid_repository_dependency_tups = []
+ for repository_elem in root.findall( 'repository' ):
+ repository_dependency_tup, repository_dependency_is_valid, err_msg = \
+ self.handle_repository_elem( repository_elem,
+ only_if_compiling_contained_td=False,
+ updating_installed_repository=updating_installed_repository )
+ if repository_dependency_is_valid:
+ valid_repository_dependency_tups.append( repository_dependency_tup )
+ else:
+ # Append the error_message to the repository dependencies tuple.
+ toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ repository_dependency_tup
+ repository_dependency_tup = ( toolshed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ only_if_compiling_contained_td,
+ err_msg )
+ invalid_repository_dependency_tups.append( repository_dependency_tup )
+ error_message += err_msg
+ if invalid_repository_dependency_tups:
+ invalid_repository_dependencies_dict[ 'repository_dependencies' ] = invalid_repository_dependency_tups
+ metadata_dict[ 'invalid_repository_dependencies' ] = invalid_repository_dependencies_dict
+ if valid_repository_dependency_tups:
+ valid_repository_dependencies_dict[ 'repository_dependencies' ] = valid_repository_dependency_tups
+ metadata_dict[ 'repository_dependencies' ] = valid_repository_dependencies_dict
+ return metadata_dict, error_message
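The validity check at the top of generate_repository_dependency_metadata() is simply "root tag must be <repositories>", after which each <repository> tag contributes one attribute tuple. A simplified stand-in (plain ElementTree instead of xml_util.parse_xml; attribute values are hypothetical):

```python
import xml.etree.ElementTree as ET

REPOSITORY_DEPENDENCIES_XML = """
<repositories description="Required packages">
    <repository toolshed="http://localhost:9009" name="package_readline_6_2"
                owner="devteam" changeset_revision="b107b91b3574" />
</repositories>
"""

root = ET.fromstring(REPOSITORY_DEPENDENCIES_XML)
# The file is considered valid only if the root tag is <repositories>.
xml_is_valid = root.tag == 'repositories'
# Each <repository> tag yields a (toolshed, name, owner, changeset_revision) tuple.
tups = [(e.get('toolshed'), e.get('name'), e.get('owner'), e.get('changeset_revision'))
        for e in root.findall('repository')]
```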
+
+ def generate_tool_metadata( self, tool_config, tool, repository_clone_url, metadata_dict ):
+ """Update the received metadata_dict with changes that have been applied to the received tool."""
+ # Generate the guid.
+ guid = suc.generate_tool_guid( repository_clone_url, tool )
+ # Handle tool.requirements.
+ tool_requirements = []
+ for tool_requirement in tool.requirements:
+ name = str( tool_requirement.name )
+ type = str( tool_requirement.type )
+ version = str( tool_requirement.version ) if tool_requirement.version else None
+ requirement_dict = dict( name=name,
+ type=type,
+ version=version )
+ tool_requirements.append( requirement_dict )
+ # Handle tool.tests.
+ tool_tests = []
+ if tool.tests:
+ for ttb in tool.tests:
+ required_files = []
+ for required_file in ttb.required_files:
+ value, extra = required_file
+ required_files.append( ( value ) )
+ inputs = []
+ for param_name, values in ttb.inputs.iteritems():
+ # Handle improperly defined or strange test parameters and values.
+ if param_name is not None:
+ if values is None:
+ # An example is the 3rd test in http://testtoolshed.g2.bx.psu.edu/view/devteam/samtools_rmdup
+ # which is defined as:
+ # <test>
+ # <param name="input1" value="1.bam" ftype="bam" />
+ # <param name="bam_paired_end_type_selector" value="PE" />
+ # <param name="force_se" />
+ # <output name="output1" file="1.bam" ftype="bam" sort="True" />
+ # </test>
+ inputs.append( ( param_name, values ) )
+ else:
+ if len( values ) == 1:
+ inputs.append( ( param_name, values[ 0 ] ) )
+ else:
+ inputs.append( ( param_name, values ) )
+ outputs = []
+ for output in ttb.outputs:
+ name, file_name, extra = output
+ outputs.append( ( name, basic_util.strip_path( file_name ) if file_name else None ) )
+ if file_name not in required_files and file_name is not None:
+ required_files.append( file_name )
+ test_dict = dict( name=str( ttb.name ),
+ required_files=required_files,
+ inputs=inputs,
+ outputs=outputs )
+ tool_tests.append( test_dict )
+ # Determine if the tool should be loaded into the tool panel. Examples of valid tools that
+ # should not be displayed in the tool panel are datatypes converters and DataManager tools
+ # (which are of type 'manage_data').
+ datatypes = metadata_dict.get( 'datatypes', None )
+ add_to_tool_panel_attribute = self.set_add_to_tool_panel_attribute_for_tool( tool=tool,
+ guid=guid,
+ datatypes=datatypes )
+ tool_dict = dict( id=tool.id,
+ guid=guid,
+ name=tool.name,
+ version=tool.version,
+ description=tool.description,
+ version_string_cmd = tool.version_string_cmd,
+ tool_config=tool_config,
+ tool_type=tool.tool_type,
+ requirements=tool_requirements,
+ tests=tool_tests,
+ add_to_tool_panel=add_to_tool_panel_attribute )
+ if 'tools' in metadata_dict:
+ metadata_dict[ 'tools' ].append( tool_dict )
+ else:
+ metadata_dict[ 'tools' ] = [ tool_dict ]
+ return metadata_dict
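The tool_dict assembled above has a fixed shape that downstream consumers rely on. A sketch with entirely hypothetical values (the real guid comes from suc.generate_tool_guid(), and the real requirements/tests are parsed from the tool):

```python
# All values below are illustrative, not taken from a real repository.
tool_dict = dict(id='fastx_trimmer',
                 guid='toolshed.example.org/repos/test/fastx/fastx_trimmer/1.0.0',
                 name='Trim sequences',
                 version='1.0.0',
                 description='',
                 version_string_cmd=None,
                 tool_config='fastx_trimmer.xml',
                 tool_type='default',
                 requirements=[dict(name='libgtextutils', type='package', version='0.6')],
                 tests=[],
                 add_to_tool_panel=True)

metadata_dict = {}
# Equivalent to the "if 'tools' in metadata_dict" append-or-create logic above.
metadata_dict.setdefault('tools', []).append(tool_dict)
```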
+
+ def generate_tool_dependency_metadata( self, repository, changeset_revision, repository_clone_url, tool_dependencies_config,
+ metadata_dict, original_repository_metadata=None ):
+ """
+ If the combination of name, version and type of each element is defined in the <requirement> tag for
+ at least one tool in the repository, then update the received metadata_dict with information from the
+ parsed tool_dependencies_config.
+ """
+ error_message = ''
+ if original_repository_metadata:
+ # Keep a copy of the original tool dependencies dictionary and the list of tool
+ # dictionaries in the metadata.
+ original_valid_tool_dependencies_dict = original_repository_metadata.get( 'tool_dependencies', None )
+ original_invalid_tool_dependencies_dict = original_repository_metadata.get( 'invalid_tool_dependencies', None )
+ else:
+ original_valid_tool_dependencies_dict = None
+ original_invalid_tool_dependencies_dict = None
+ tree, error_message = xml_util.parse_xml( tool_dependencies_config )
+ if tree is None:
+ return metadata_dict, error_message
+ root = tree.getroot()
+ tool_dependency_is_valid = True
+ valid_tool_dependencies_dict = {}
+ invalid_tool_dependencies_dict = {}
+ valid_repository_dependency_tups = []
+ invalid_repository_dependency_tups = []
+ tools_metadata = metadata_dict.get( 'tools', None )
+ description = root.get( 'description' )
+ for elem in root:
+ if elem.tag == 'package':
+ valid_tool_dependencies_dict, \
+ invalid_tool_dependencies_dict, \
+ repository_dependency_tup, \
+ repository_dependency_is_valid, \
+ message = self.generate_package_dependency_metadata( elem,
+ valid_tool_dependencies_dict,
+ invalid_tool_dependencies_dict )
+ if repository_dependency_is_valid:
+ if repository_dependency_tup and repository_dependency_tup not in valid_repository_dependency_tups:
+ # We have a valid complex repository dependency.
+ valid_repository_dependency_tups.append( repository_dependency_tup )
+ else:
+ if repository_dependency_tup and repository_dependency_tup not in invalid_repository_dependency_tups:
+ # We have an invalid complex repository dependency, so mark the tool dependency as invalid.
+ tool_dependency_is_valid = False
+ # Append the error message to the invalid repository dependency tuple.
+ toolshed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td \
+ = repository_dependency_tup
+ repository_dependency_tup = \
+ ( toolshed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td, \
+ message )
+ invalid_repository_dependency_tups.append( repository_dependency_tup )
+ error_message = '%s %s' % ( error_message, message )
+ elif elem.tag == 'set_environment':
+ valid_tool_dependencies_dict = \
+ self.generate_environment_dependency_metadata( elem, valid_tool_dependencies_dict )
+ if valid_tool_dependencies_dict:
+ if original_valid_tool_dependencies_dict:
+ # We're generating metadata on an update pulled to a tool shed repository installed
+ # into a Galaxy instance, so handle changes to tool dependencies appropriately.
+ irm = self.app.installed_repository_manager
+ updated_tool_dependency_names, deleted_tool_dependency_names = \
+ irm.handle_existing_tool_dependencies_that_changed_in_update( repository,
+ original_valid_tool_dependencies_dict,
+ valid_tool_dependencies_dict )
+ metadata_dict[ 'tool_dependencies' ] = valid_tool_dependencies_dict
+ if invalid_tool_dependencies_dict:
+ metadata_dict[ 'invalid_tool_dependencies' ] = invalid_tool_dependencies_dict
+ if valid_repository_dependency_tups:
+ metadata_dict = \
+ self.update_repository_dependencies_metadata( metadata=metadata_dict,
+ repository_dependency_tups=valid_repository_dependency_tups,
+ is_valid=True,
+ description=description )
+ if invalid_repository_dependency_tups:
+ metadata_dict = \
+ self.update_repository_dependencies_metadata( metadata=metadata_dict,
+ repository_dependency_tups=invalid_repository_dependency_tups,
+ is_valid=False,
+ description=description )
+ return metadata_dict, error_message
+
+ def generate_workflow_metadata( self, relative_path, exported_workflow_dict, metadata_dict ):
+ """
+ Update the received metadata_dict with changes that have been applied to the
+ received exported_workflow_dict.
+ """
+ if 'workflows' in metadata_dict:
+ metadata_dict[ 'workflows' ].append( ( relative_path, exported_workflow_dict ) )
+ else:
+ metadata_dict[ 'workflows' ] = [ ( relative_path, exported_workflow_dict ) ]
+ return metadata_dict
+
+ def get_relative_path_to_repository_file( self, root, name, relative_install_dir, work_dir, shed_config_dict,
+ resetting_all_metadata_on_repository ):
+ if resetting_all_metadata_on_repository:
+ full_path_to_file = os.path.join( root, name )
+ stripped_path_to_file = full_path_to_file.replace( work_dir, '' )
+ if stripped_path_to_file.startswith( '/' ):
+ stripped_path_to_file = stripped_path_to_file[ 1: ]
+ relative_path_to_file = os.path.join( relative_install_dir, stripped_path_to_file )
+ else:
+ relative_path_to_file = os.path.join( root, name )
+ if relative_install_dir and \
+ shed_config_dict.get( 'tool_path' ) and \
+ relative_path_to_file.startswith( os.path.join( shed_config_dict.get( 'tool_path' ), relative_install_dir ) ):
+ relative_path_to_file = relative_path_to_file[ len( shed_config_dict.get( 'tool_path' ) ) + 1: ]
+ return relative_path_to_file
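The final stripping step above removes the shed tool_path prefix so the stored path is relative to the install dir. A sketch of just that step, with made-up paths:

```python
import os

def strip_tool_path(relative_path_to_file, tool_path, relative_install_dir):
    """Drop the shed tool_path prefix, mirroring the last branch of the method above."""
    prefix = os.path.join(tool_path, relative_install_dir)
    if relative_path_to_file.startswith(prefix):
        # +1 skips the path separator that follows tool_path.
        return relative_path_to_file[len(tool_path) + 1:]
    return relative_path_to_file

result = strip_tool_path('../shed_tools/repos/test/fastx/fastx_trimmer.xml',
                         '../shed_tools', 'repos/test/fastx')
```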
+
+ def get_sample_files_from_disk( self, repository_files_dir, tool_path=None, relative_install_dir=None,
+ resetting_all_metadata_on_repository=False ):
+ if resetting_all_metadata_on_repository:
+ # Keep track of the location where the repository is temporarily cloned so that we can strip
+ # it when setting metadata.
+ work_dir = repository_files_dir
+ sample_file_metadata_paths = []
+ sample_file_copy_paths = []
+ for root, dirs, files in os.walk( repository_files_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name.endswith( '.sample' ):
+ if resetting_all_metadata_on_repository:
+ full_path_to_sample_file = os.path.join( root, name )
+ stripped_path_to_sample_file = full_path_to_sample_file.replace( work_dir, '' )
+ if stripped_path_to_sample_file.startswith( '/' ):
+ stripped_path_to_sample_file = stripped_path_to_sample_file[ 1: ]
+ relative_path_to_sample_file = os.path.join( relative_install_dir, stripped_path_to_sample_file )
+ if os.path.exists( relative_path_to_sample_file ):
+ sample_file_copy_paths.append( relative_path_to_sample_file )
+ else:
+ sample_file_copy_paths.append( full_path_to_sample_file )
+ else:
+ relative_path_to_sample_file = os.path.join( root, name )
+ sample_file_copy_paths.append( relative_path_to_sample_file )
+ if tool_path and relative_install_dir:
+ if relative_path_to_sample_file.startswith( os.path.join( tool_path, relative_install_dir ) ):
+ relative_path_to_sample_file = relative_path_to_sample_file[ len( tool_path ) + 1 :]
+ sample_file_metadata_paths.append( relative_path_to_sample_file )
+ return sample_file_metadata_paths, sample_file_copy_paths
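The directory walk above can be exercised in isolation: skip anything under the mercurial metadata directory, collect files ending in `.sample`. A runnable sketch against a throwaway layout:

```python
import os
import tempfile

# Build a throwaway repository layout with one real sample file and one
# inside .hg that must be skipped.
repo_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(repo_dir, '.hg'))
os.makedirs(os.path.join(repo_dir, 'tool-data'))
open(os.path.join(repo_dir, 'tool-data', 'fastx.loc.sample'), 'w').close()
open(os.path.join(repo_dir, '.hg', 'ignored.sample'), 'w').close()

sample_files = []
for root, dirs, files in os.walk(repo_dir):
    # Same guard as the method above: ignore the mercurial metadata directory.
    if root.find('.hg') < 0:
        for name in files:
            if name.endswith('.sample'):
                sample_files.append(os.path.join(root, name))
```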
+
+ def handle_repository_elem( self, repository_elem, only_if_compiling_contained_td=False, updating_installed_repository=False ):
+ """
+ Process the received repository_elem which is a <repository> tag either from a
+ repository_dependencies.xml file or a tool_dependencies.xml file. If the former,
+ we're generating repository dependencies metadata for a repository in the Tool Shed.
+ If the latter, we're generating package dependency metadata within Galaxy or the
+ Tool Shed.
+ """
+ is_valid = True
+ error_message = ''
+ toolshed = repository_elem.get( 'toolshed', None )
+ name = repository_elem.get( 'name', None )
+ owner = repository_elem.get( 'owner', None )
+ changeset_revision = repository_elem.get( 'changeset_revision', None )
+ prior_installation_required = str( repository_elem.get( 'prior_installation_required', False ) )
+ if self.app.name == 'galaxy':
+ if updating_installed_repository:
+ pass
+ else:
+ # We're installing a repository into Galaxy, so make sure its contained repository
+ # dependency definition is valid.
+ if toolshed is None or name is None or owner is None or changeset_revision is None:
+ # Raise an exception here instead of returning an error_message to keep the
+ # installation from proceeding. Reaching here implies a bug in the Tool Shed
+ # framework.
+ error_message = 'Installation halted because the following repository dependency definition is invalid:\n'
+ error_message += xml_util.xml_to_string( repository_elem, use_indent=True )
+ raise Exception( error_message )
+ if not toolshed:
+ # Default to the current tool shed.
+ toolshed = str( url_for( '/', qualified=True ) ).rstrip( '/' )
+ repository_dependency_tup = [ toolshed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ str( only_if_compiling_contained_td ) ]
+ user = None
+ repository = None
+ toolshed = common_util.remove_protocol_from_tool_shed_url( toolshed )
+ if self.app.name == 'galaxy':
+ # We're in Galaxy. We reach here when we're generating the metadata for a tool
+ # dependencies package defined for a repository or when we're generating metadata
+ # for an installed repository. See if we can locate the installed repository via
+ # the changeset_revision defined in the repository_elem (it may be outdated). If
+ # we're successful in locating an installed repository with the attributes defined
+ # in the repository_elem, we know it is valid.
+ repository = suc.get_repository_for_dependency_relationship( self.app,
+ toolshed,
+ name,
+ owner,
+ changeset_revision )
+ if repository:
+ return repository_dependency_tup, is_valid, error_message
+ else:
+ # Send a request to the tool shed to retrieve appropriate additional changeset
+ # revisions with which the repository may have been installed.
+ text = suc.get_updated_changeset_revisions_from_tool_shed( self.app,
+ toolshed,
+ name,
+ owner,
+ changeset_revision )
+ if text:
+ updated_changeset_revisions = util.listify( text )
+ for updated_changeset_revision in updated_changeset_revisions:
+ repository = suc.get_repository_for_dependency_relationship( self.app,
+ toolshed,
+ name,
+ owner,
+ updated_changeset_revision )
+ if repository:
+ return repository_dependency_tup, is_valid, error_message
+ if updating_installed_repository:
+ # The repository dependency was included in an update to the installed
+ # repository, so it will not yet be installed. Return the tuple for later
+ # installation.
+ return repository_dependency_tup, is_valid, error_message
+ if updating_installed_repository:
+ # The repository dependency was included in an update to the installed repository,
+ # so it will not yet be installed. Return the tuple for later installation.
+ return repository_dependency_tup, is_valid, error_message
+ # Don't generate an error message for missing repository dependencies that are required
+ # only if compiling the dependent repository's tool dependency.
+ if not only_if_compiling_contained_td:
+ # We'll currently default to setting the repository dependency definition as invalid
+ # if an installed repository cannot be found. This may not be ideal because the tool
+ # shed may have simply been inaccessible when metadata was being generated for the
+ # installed tool shed repository.
+ error_message = "Ignoring invalid repository dependency definition for tool shed %s, name %s, owner %s, " % \
+ ( toolshed, name, owner )
+ error_message += "changeset revision %s." % changeset_revision
+ log.debug( error_message )
+ is_valid = False
+ return repository_dependency_tup, is_valid, error_message
+ else:
+ # We're in the tool shed.
+ if suc.tool_shed_is_this_tool_shed( toolshed ):
+ try:
+ user = self.sa_session.query( self.app.model.User ) \
+ .filter( self.app.model.User.table.c.username == owner ) \
+ .one()
+ except Exception, e:
+ error_message = "Ignoring repository dependency definition for tool shed %s, name %s, owner %s, " % \
+ ( toolshed, name, owner )
+ error_message += "changeset revision %s because the owner is invalid. " % changeset_revision
+ log.debug( error_message )
+ is_valid = False
+ return repository_dependency_tup, is_valid, error_message
+ try:
+ repository = self.sa_session.query( self.app.model.Repository ) \
+ .filter( and_( self.app.model.Repository.table.c.name == name,
+ self.app.model.Repository.table.c.user_id == user.id ) ) \
+ .one()
+ except:
+ error_message = "Ignoring repository dependency definition for tool shed %s, name %s, owner %s, " % \
+ ( toolshed, name, owner )
+ error_message += "changeset revision %s because the name is invalid. " % changeset_revision
+ log.debug( error_message )
+ is_valid = False
+ return repository_dependency_tup, is_valid, error_message
+ repo = hg_util.get_repo_for_repository( self.app, repository=repository, repo_path=None, create=False )
+
+ # The received changeset_revision may be None since defining it in the dependency definition is optional.
+ # If this is the case, the default will be to set its value to the repository dependency tip revision.
+ # This probably occurs only when handling circular dependency definitions.
+ tip_ctx = repo.changectx( repo.changelog.tip() )
+ # Make sure the repo.changelog includes at least 1 revision.
+ if changeset_revision is None and tip_ctx.rev() >= 0:
+ changeset_revision = str( tip_ctx )
+ repository_dependency_tup = [ toolshed,
+ name,
+ owner,
+ changeset_revision,
+ prior_installation_required,
+ str( only_if_compiling_contained_td ) ]
+ return repository_dependency_tup, is_valid, error_message
+ else:
+ # Find the specified changeset revision in the repository's changelog to see if it's valid.
+ found = False
+ for changeset in repo.changelog:
+ changeset_hash = str( repo.changectx( changeset ) )
+ if changeset_hash == changeset_revision:
+ found = True
+ break
+ if not found:
+ error_message = "Ignoring repository dependency definition for tool shed %s, name %s, owner %s, " % \
+ ( toolshed, name, owner )
+ error_message += "changeset revision %s because the changeset revision is invalid. " % changeset_revision
+ log.debug( error_message )
+ is_valid = False
+ return repository_dependency_tup, is_valid, error_message
+ else:
+ # Repository dependencies are currently supported within a single tool shed.
+ error_message = "Repository dependencies are currently supported only within the same tool shed. Ignoring "
+ error_message += "repository dependency definition for tool shed %s, name %s, owner %s, changeset revision %s. " % \
+ ( toolshed, name, owner, changeset_revision )
+ log.debug( error_message )
+ is_valid = False
+ return repository_dependency_tup, is_valid, error_message
+ return repository_dependency_tup, is_valid, error_message
+
+ def set_add_to_tool_panel_attribute_for_tool( self, tool, guid, datatypes ):
+ """
+ Determine if a tool should be loaded into the Galaxy tool panel. Examples of valid tools that
+ should not be displayed in the tool panel are datatypes converters and DataManager tools.
+ """
+ if hasattr( tool, 'tool_type' ):
+ if tool.tool_type in [ 'manage_data' ]:
+ # We have a DataManager tool.
+ return False
+ if datatypes:
+ for datatype_dict in datatypes:
+ converters = datatype_dict.get( 'converters', None )
+ # [{"converters":
+ # [{"target_datatype": "gff",
+ # "tool_config": "bed_to_gff_converter.xml",
+ # "guid": "localhost:9009/repos/test/bed_to_gff_converter/CONVERTER_bed_to_gff_0/2.0.0"}],
+ # "display_in_upload": "true",
+ # "dtype": "galaxy.datatypes.interval:Bed",
+ # "extension": "bed"}]
+ if converters:
+ for converter_dict in converters:
+ converter_guid = converter_dict.get( 'guid', None )
+ if converter_guid:
+ if converter_guid == guid:
+ # We have a datatypes converter.
+ return False
+ return True
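The converter check above just matches the tool's guid against every converter guid in the datatypes metadata. A simplified version, using the exact datatypes structure quoted in the comment (the function name is a hypothetical stand-in for the method):

```python
def tool_goes_in_panel(guid, datatypes):
    """Return False when the guid belongs to a datatypes converter."""
    for datatype_dict in datatypes or []:
        for converter_dict in datatype_dict.get('converters') or []:
            if converter_dict.get('guid') == guid:
                # We have a datatypes converter; keep it out of the tool panel.
                return False
    return True

# Structure copied from the comment in the method above.
datatypes = [{"converters":
                  [{"target_datatype": "gff",
                    "tool_config": "bed_to_gff_converter.xml",
                    "guid": "localhost:9009/repos/test/bed_to_gff_converter/CONVERTER_bed_to_gff_0/2.0.0"}],
              "display_in_upload": "true",
              "dtype": "galaxy.datatypes.interval:Bed",
              "extension": "bed"}]
converter_guid = datatypes[0]['converters'][0]['guid']
```

The full method also excludes tools of type 'manage_data' (DataManager tools); this sketch covers only the converter branch.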
+
+ def update_repository_dependencies_metadata( self, metadata, repository_dependency_tups, is_valid, description ):
+ if is_valid:
+ repository_dependencies_dict = metadata.get( 'repository_dependencies', None )
+ else:
+ repository_dependencies_dict = metadata.get( 'invalid_repository_dependencies', None )
+ for repository_dependency_tup in repository_dependency_tups:
+ if is_valid:
+ tool_shed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td \
+ = repository_dependency_tup
+ else:
+ tool_shed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td, \
+ error_message \
+ = repository_dependency_tup
+ if repository_dependencies_dict:
+ repository_dependencies = repository_dependencies_dict.get( 'repository_dependencies', [] )
+ for repository_dependency_tup in repository_dependency_tups:
+ if repository_dependency_tup not in repository_dependencies:
+ repository_dependencies.append( repository_dependency_tup )
+ repository_dependencies_dict[ 'repository_dependencies' ] = repository_dependencies
+ else:
+ repository_dependencies_dict = dict( description=description,
+ repository_dependencies=repository_dependency_tups )
+ if repository_dependencies_dict:
+ if is_valid:
+ metadata[ 'repository_dependencies' ] = repository_dependencies_dict
+ else:
+ metadata[ 'invalid_repository_dependencies' ] = repository_dependencies_dict
+ return metadata
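The merge performed by update_repository_dependencies_metadata() is append-without-duplicates into an existing dict, or create-from-scratch otherwise. A simplified sketch of the valid-dependency path (helper name and tuple values are hypothetical):

```python
def merge_repository_dependency_tups(metadata, tups, description):
    """Merge dependency tuples into metadata, skipping duplicates."""
    deps_dict = metadata.get('repository_dependencies')
    if deps_dict:
        existing = deps_dict.get('repository_dependencies', [])
        for tup in tups:
            if tup not in existing:
                existing.append(tup)
        deps_dict['repository_dependencies'] = existing
    else:
        deps_dict = dict(description=description, repository_dependencies=list(tups))
    metadata['repository_dependencies'] = deps_dict
    return metadata

tup = ('toolshed', 'package_readline_6_2', 'devteam', 'b107b91b3574', 'True', 'False')
metadata = merge_repository_dependency_tups({}, [tup], 'demo')
# Merging the same tuple again must not create a duplicate entry.
metadata = merge_repository_dependency_tups(metadata, [tup], 'demo')
```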
This diff is so big that we needed to truncate the remainder.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Add a DependencyDisplayer class for Galaxy installs from the Tool Shed.
by commits-noreply@bitbucket.org 16 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/5af80141674f/
Changeset: 5af80141674f
User: greg
Date: 2014-07-16 20:30:36
Summary: Add a DependencyDisplayer class for Galaxy installs from the Tool Shed.
Affected #: 9 files
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -3,16 +3,12 @@
import shutil
from admin import AdminGalaxy
-from galaxy import eggs
from galaxy import web
from galaxy import util
from galaxy.web.form_builder import CheckboxField
-from galaxy.web.framework.helpers import grids
-from galaxy.web.framework.helpers import iff
from galaxy.util import json
from galaxy.model.orm import or_
-import tool_shed.util.shed_util_common as suc
import tool_shed.repository_types.util as rt_util
from tool_shed.util import common_util
@@ -23,11 +19,14 @@
from tool_shed.util import metadata_util
from tool_shed.util import readme_util
from tool_shed.util import repository_maintenance_util
+from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import workflow_util
-from tool_shed.util import xml_util
+
+from tool_shed.galaxy_install import dependency_display
from tool_shed.galaxy_install import install_manager
+
from tool_shed.galaxy_install.repair_repository_manager import RepairRepositoryManager
import tool_shed.galaxy_install.grids.admin_toolshed_grids as admin_toolshed_grids
from tool_shed.galaxy_install.repository_dependencies import repository_dependency_manager
@@ -221,17 +220,20 @@
@web.require_admin
def deactivate_or_uninstall_repository( self, trans, **kwd ):
"""
- Handle all changes when a tool shed repository is being deactivated or uninstalled. Notice that if the repository contents include
- a file named tool_data_table_conf.xml.sample, its entries are not removed from the defined config.shed_tool_data_table_config. This
- is because it becomes a bit complex to determine if other installed repositories include tools that require the same entry. For now
- we'll never delete entries from config.shed_tool_data_table_config, but we may choose to do so in the future if it becomes necessary.
+ Handle all changes when a tool shed repository is being deactivated or uninstalled. Notice
+ that if the repository contents include a file named tool_data_table_conf.xml.sample, its
+ entries are not removed from the defined config.shed_tool_data_table_config. This is because
+ it becomes a bit complex to determine if other installed repositories include tools that
+ require the same entry. For now we'll never delete entries from config.shed_tool_data_table_config,
+ but we may choose to do so in the future if it becomes necessary.
"""
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
remove_from_disk = kwd.get( 'remove_from_disk', '' )
remove_from_disk_checked = CheckboxField.is_checked( remove_from_disk )
tool_shed_repository = suc.get_installed_tool_shed_repository( trans.app, kwd[ 'id' ] )
- shed_tool_conf, tool_path, relative_install_dir = suc.get_tool_panel_config_tool_path_install_dir( trans.app, tool_shed_repository )
+ shed_tool_conf, tool_path, relative_install_dir = \
+ suc.get_tool_panel_config_tool_path_install_dir( trans.app, tool_shed_repository )
if relative_install_dir:
if tool_path:
relative_install_dir = os.path.join( tool_path, relative_install_dir )
@@ -247,7 +249,10 @@
data_manager_util.remove_from_data_manager( trans.app, tool_shed_repository )
if tool_shed_repository.includes_datatypes:
# Deactivate proprietary datatypes.
- installed_repository_dict = datatype_util.load_installed_datatypes( trans.app, tool_shed_repository, repository_install_dir, deactivate=True )
+ installed_repository_dict = datatype_util.load_installed_datatypes( trans.app,
+ tool_shed_repository,
+ repository_install_dir,
+ deactivate=True )
if installed_repository_dict:
converter_path = installed_repository_dict.get( 'converter_path' )
if converter_path is not None:
@@ -779,12 +784,12 @@
trans.install_model.context.add( repository )
trans.install_model.context.flush()
message = "The repository information has been updated."
- containers_dict = metadata_util.populate_containers_dict_from_repository_metadata( app=trans.app,
- tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- repository=repository,
- reinstalling=False,
- required_repo_info_dicts=None )
+ dd = dependency_display.DependencyDisplayer( trans.app )
+ containers_dict = dd.populate_containers_dict_from_repository_metadata( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ repository=repository,
+ reinstalling=False,
+ required_repo_info_dicts=None )
return trans.fill_template( '/admin/tool_shed_repository/manage_repository.mako',
repository=repository,
description=description,
@@ -1049,6 +1054,7 @@
includes_tool_dependencies = util.string_as_bool( repo_information_dict.get( 'includes_tool_dependencies', False ) )
encoded_repo_info_dicts = util.listify( repo_information_dict.get( 'repo_info_dicts', [] ) )
repo_info_dicts = [ encoding_util.tool_shed_decode( encoded_repo_info_dict ) for encoded_repo_info_dict in encoded_repo_info_dicts ]
+ dd = dependency_display.DependencyDisplayer( trans.app )
install_repository_manager = install_manager.InstallRepositoryManager( trans.app )
if ( not includes_tools_for_display_in_tool_panel and kwd.get( 'select_shed_tool_panel_config_button', False ) ) or \
( includes_tools_for_display_in_tool_panel and kwd.get( 'select_tool_panel_section_button', False ) ):
@@ -1149,14 +1155,14 @@
# defined repository (and possibly tool) dependencies. In this case, merging will result in newly defined
# dependencies to be lost. We pass the updating parameter to make sure merging occurs only when appropriate.
containers_dict = \
- install_repository_manager.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- readme_files_dict=readme_files_dict,
- installed_repository_dependencies=installed_repository_dependencies,
- missing_repository_dependencies=missing_repository_dependencies,
- installed_tool_dependencies=installed_tool_dependencies,
- missing_tool_dependencies=missing_tool_dependencies,
- updating=updating )
+ dd.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ readme_files_dict=readme_files_dict,
+ installed_repository_dependencies=installed_repository_dependencies,
+ missing_repository_dependencies=missing_repository_dependencies,
+ installed_tool_dependencies=installed_tool_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ updating=updating )
else:
# We're installing a list of repositories, each of which may have tool dependencies or repository dependencies.
containers_dicts = []
@@ -1184,17 +1190,17 @@
name = dependencies_for_repository_dict.get( 'name', None )
repository_owner = dependencies_for_repository_dict.get( 'repository_owner', None )
containers_dict = \
- install_repository_manager.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- readme_files_dict=None,
- installed_repository_dependencies=installed_repository_dependencies,
- missing_repository_dependencies=missing_repository_dependencies,
- installed_tool_dependencies=installed_tool_dependencies,
- missing_tool_dependencies=missing_tool_dependencies,
- updating=updating )
+ dd.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ readme_files_dict=None,
+ installed_repository_dependencies=installed_repository_dependencies,
+ missing_repository_dependencies=missing_repository_dependencies,
+ installed_tool_dependencies=installed_tool_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ updating=updating )
containers_dicts.append( containers_dict )
# Merge all containers into a single container.
- containers_dict = install_repository_manager.merge_containers_dicts_for_new_install( containers_dicts )
+ containers_dict = dd.merge_containers_dicts_for_new_install( containers_dicts )
# Handle tool dependencies check box.
if trans.app.config.tool_dependency_dir is None:
if includes_tool_dependencies:
@@ -1647,19 +1653,20 @@
original_section_name = ''
tool_panel_section_select_field = None
shed_tool_conf_select_field = tool_util.build_shed_tool_conf_select_field( trans )
- irm = install_manager.InstallRepositoryManager( trans.app )
+ dd = dependency_display.DependencyDisplayer( trans.app )
containers_dict = \
- irm.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- readme_files_dict=readme_files_dict,
- installed_repository_dependencies=installed_repository_dependencies,
- missing_repository_dependencies=missing_repository_dependencies,
- installed_tool_dependencies=installed_tool_dependencies,
- missing_tool_dependencies=missing_tool_dependencies,
- updating=False )
- # Since we're reinstalling we'll merge the list of missing repository dependencies into the list of installed repository dependencies since each displayed
- # repository dependency will display a status, whether installed or missing.
- containers_dict = irm.merge_missing_repository_dependencies_to_installed_container( containers_dict )
+ dd.populate_containers_dict_for_new_install( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ readme_files_dict=readme_files_dict,
+ installed_repository_dependencies=installed_repository_dependencies,
+ missing_repository_dependencies=missing_repository_dependencies,
+ installed_tool_dependencies=installed_tool_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ updating=False )
+ # Since we're reinstalling we'll merge the list of missing repository dependencies into the list of
+ # installed repository dependencies since each displayed repository dependency will display a status,
+ # whether installed or missing.
+ containers_dict = dd.merge_missing_repository_dependencies_to_installed_container( containers_dict )
# Handle repository dependencies check box.
install_repository_dependencies_check_box = CheckboxField( 'install_repository_dependencies', checked=True )
# Handle tool dependencies check box.
@@ -1796,12 +1803,12 @@
status = 'error'
shed_tool_conf, tool_path, relative_install_dir = suc.get_tool_panel_config_tool_path_install_dir( trans.app, repository )
repo_files_dir = os.path.abspath( os.path.join( relative_install_dir, repository.name ) )
- containers_dict = metadata_util.populate_containers_dict_from_repository_metadata( app=trans.app,
- tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- repository=repository,
- reinstalling=False,
- required_repo_info_dicts=None )
+ dd = dependency_display.DependencyDisplayer( trans.app )
+ containers_dict = dd.populate_containers_dict_from_repository_metadata( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ repository=repository,
+ reinstalling=False,
+ required_repo_info_dicts=None )
return trans.fill_template( '/admin/tool_shed_repository/manage_repository.mako',
repository=repository,
description=repository.description,
@@ -1860,7 +1867,8 @@
message = "Error attempting to uninstall tool dependencies: %s" % message
status = 'error'
else:
- message = "These tool dependencies have been uninstalled: %s" % ','.join( td.name for td in tool_dependencies_for_uninstallation )
+ message = "These tool dependencies have been uninstalled: %s" % \
+ ','.join( td.name for td in tool_dependencies_for_uninstallation )
td_ids = [ trans.security.encode_id( td.id ) for td in tool_shed_repository.tool_dependencies ]
return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
action='manage_repository_tool_dependencies',
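
The merge step used in the admin_toolshed hunks above (folding the per-repository containers_dicts via merge_containers_dicts_for_new_install) can be sketched abstractly. Everything below is a hypothetical simplification, not the Galaxy implementation: containers are plain dicts, None means "not populated", and list entries from successive repositories are concatenated with de-duplication.

```python
def merge_containers_dicts(containers_dicts):
    """Hypothetical simplification of merge_containers_dicts_for_new_install:
    fold per-repository container dicts into one merged dict."""
    merged = dict(repository_dependencies=None, tool_dependencies=None)
    for containers_dict in containers_dicts:
        for key, value in containers_dict.items():
            if value is None:
                continue  # nothing to merge for this container
            if merged.get(key) is None:
                merged[key] = list(value)  # first populated container seeds the result
            else:
                # Append only entries not already present (mirrors the
                # td.listify de-duplication in the real method).
                for entry in value:
                    if entry not in merged[key]:
                        merged[key].append(entry)
    return merged
```

In the real code the containers are container_util.Folder trees with managed folder ids, and the merge runs under a threading.Lock; the dict version above only shows the de-duplicating fold.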
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -6,16 +6,20 @@
from time import strftime
from datetime import date
from datetime import datetime
+
from galaxy import util
from galaxy import web
-from galaxy.util.odict import odict
from galaxy.web.base.controller import BaseUIController
from galaxy.web.form_builder import CheckboxField
from galaxy.web.framework.helpers import grids
from galaxy.util import json
from galaxy.model.orm import and_
+
from tool_shed.capsule import capsule_manager
from tool_shed.dependencies.repository import relation_builder
+
+from tool_shed.galaxy_install import dependency_display
+
from tool_shed.util import basic_util
from tool_shed.util import common_util
from tool_shed.util import container_util
@@ -24,17 +28,16 @@
from tool_shed.util import metadata_util
from tool_shed.util import readme_util
from tool_shed.util import repository_maintenance_util
-from tool_shed.util import review_util
from tool_shed.util import search_util
from tool_shed.util import shed_util_common as suc
-from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import workflow_util
-from tool_shed.galaxy_install import install_manager
+
from galaxy.webapps.tool_shed.util import ratings_util
-import galaxy.tools
+
import tool_shed.grids.repository_grids as repository_grids
import tool_shed.grids.util as grids_util
+
import tool_shed.repository_types.util as rt_util
from galaxy import eggs
@@ -2421,9 +2424,8 @@
status = 'warning'
else:
# Handle messaging for orphan tool dependency definitions.
- orphan_message = tool_dependency_util.generate_message_for_orphan_tool_dependencies( trans,
- repository,
- metadata )
+ dd = dependency_display.DependencyDisplayer( trans.app )
+ orphan_message = dd.generate_message_for_orphan_tool_dependencies( repository, metadata )
if orphan_message:
message += orphan_message
status = 'warning'
@@ -3313,7 +3315,8 @@
repository_dependencies = rb.get_repository_dependencies_for_changeset_revision()
if str( repository.type ) != rt_util.TOOL_DEPENDENCY_DEFINITION:
# Handle messaging for orphan tool dependency definitions.
- orphan_message = tool_dependency_util.generate_message_for_orphan_tool_dependencies( trans, repository, metadata )
+ dd = dependency_display.DependencyDisplayer( trans.app )
+ orphan_message = dd.generate_message_for_orphan_tool_dependencies( repository, metadata )
if orphan_message:
message += orphan_message
status = 'warning'
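
The pattern repeated throughout this changeset is the one named in the commit message: module-level utilities that took `trans` now live on a class that captures only `trans.app`. A minimal sketch of that shape follows; the method body is a hypothetical stand-in, not the real implementation.

```python
class DependencyDisplayer(object):
    """Helper bound to the application object rather than a per-request trans."""

    def __init__(self, app):
        # Capture only application-scoped state at construction time.
        self.app = app

    def generate_message_for_orphan_tool_dependencies(self, repository, metadata_dict):
        # Hypothetical stand-in body: the real method compares tool dependency
        # definitions against the requirements of the repository's tools.
        if metadata_dict and metadata_dict.get('orphan_tool_dependencies'):
            return 'Orphan tool dependencies were found in %s. ' % repository
        return ''


class FakeApp(object):
    """Stand-in for trans.app, carrying whatever config the helper reads."""


# Call sites shrink from f(trans, repository, metadata) to:
dd = DependencyDisplayer(FakeApp())
orphan_message = dd.generate_message_for_orphan_tool_dependencies('some_repo', {})
```

Because the helper no longer needs a request object, it can be constructed anywhere an app reference exists, which is what lets the same class serve admin_toolshed.py, repository.py, and upload.py.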
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -4,18 +4,21 @@
import tarfile
import tempfile
import urllib
-from galaxy.web.base.controller import BaseUIController
+
from galaxy import util
from galaxy import web
from galaxy.datatypes import checkers
+from galaxy.web.base.controller import BaseUIController
+
from tool_shed.dependencies import attribute_handlers
+from tool_shed.galaxy_install import dependency_display
import tool_shed.repository_types.util as rt_util
+
from tool_shed.util import basic_util
from tool_shed.util import commit_util
from tool_shed.util import hg_util
from tool_shed.util import metadata_util
from tool_shed.util import shed_util_common as suc
-from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import xml_util
@@ -274,6 +277,7 @@
metadata_dict = repository.metadata_revisions[ 0 ].metadata
else:
metadata_dict = {}
+ dd = dependency_display.DependencyDisplayer( trans.app )
if str( repository.type ) not in [ rt_util.REPOSITORY_SUITE_DEFINITION,
rt_util.TOOL_DEPENDENCY_DEFINITION ]:
change_repository_type_message = rt_util.generate_message_for_repository_type_change( trans.app,
@@ -288,15 +292,12 @@
# repository), so warning messages are important because orphans are always valid. The repository
# owner must be warned in case they did not intend to define an orphan dependency, but simply
# provided incorrect information (tool shed, name owner, changeset_revision) for the definition.
- orphan_message = tool_dependency_util.generate_message_for_orphan_tool_dependencies( trans,
- repository,
- metadata_dict )
+ orphan_message = dd.generate_message_for_orphan_tool_dependencies( repository, metadata_dict )
if orphan_message:
message += orphan_message
status = 'warning'
# Handle messaging for invalid tool dependencies.
- invalid_tool_dependencies_message = \
- tool_dependency_util.generate_message_for_invalid_tool_dependencies( metadata_dict )
+ invalid_tool_dependencies_message = dd.generate_message_for_invalid_tool_dependencies( metadata_dict )
if invalid_tool_dependencies_message:
message += invalid_tool_dependencies_message
status = 'error'
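
The upload controller above builds one DependencyDisplayer and appends each of its messages, letting a later 'error' status outrank the earlier 'warning'. That accumulation can be sketched on its own; the function and severity names below are hypothetical.

```python
SEVERITY_RANK = {'done': 0, 'warning': 1, 'error': 2}


def accumulate_messages(checks, status='done'):
    """Hypothetical sketch: concatenate non-empty messages and keep the
    highest-ranked severity seen, as the upload handler does."""
    message = ''
    for msg, severity in checks:
        if msg:
            message += msg
            if SEVERITY_RANK[severity] > SEVERITY_RANK[status]:
                status = severity
    return message, status


# Mirrors the diff: orphan definitions warn, invalid definitions are errors.
message, status = accumulate_messages([
    ('Orphan tool dependency found. ', 'warning'),
    ('Invalid tool dependency definition. ', 'error'),
])
```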
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/tool_shed/galaxy_install/dependency_display.py
--- /dev/null
+++ b/lib/tool_shed/galaxy_install/dependency_display.py
@@ -0,0 +1,575 @@
+import logging
+import os
+import threading
+
+from galaxy.util import json
+
+from tool_shed.util import common_util
+from tool_shed.util import container_util
+from tool_shed.util import readme_util
+from tool_shed.util import shed_util_common as suc
+from tool_shed.util import tool_dependency_util
+
+log = logging.getLogger( __name__ )
+
+
+class DependencyDisplayer( object ):
+
+ def __init__( self, app ):
+ self.app = app
+
+ def add_installation_directories_to_tool_dependencies( self, tool_dependencies ):
+ """
+ Determine the path to the installation directory for each of the received
+ tool dependencies. This path will be displayed within the tool dependencies
+ container on the select_tool_panel_section or reselect_tool_panel_section
+ pages when installing or reinstalling repositories that contain tools with
+ the defined tool dependencies. The list of tool dependencies may be associated
+ with more than a single repository.
+ """
+ for dependency_key, requirements_dict in tool_dependencies.items():
+ if dependency_key in [ 'set_environment' ]:
+ continue
+ repository_name = requirements_dict.get( 'repository_name', 'unknown' )
+ repository_owner = requirements_dict.get( 'repository_owner', 'unknown' )
+ changeset_revision = requirements_dict.get( 'changeset_revision', 'unknown' )
+ dependency_name = requirements_dict[ 'name' ]
+ version = requirements_dict[ 'version' ]
+ type = requirements_dict[ 'type' ]
+ if self.app.config.tool_dependency_dir:
+ root_dir = self.app.config.tool_dependency_dir
+ else:
+ root_dir = '<set your tool_dependency_dir in your Galaxy configuration file>'
+ install_dir = os.path.join( root_dir,
+ dependency_name,
+ version,
+ repository_owner,
+ repository_name,
+ changeset_revision )
+ requirements_dict[ 'install_dir' ] = install_dir
+ tool_dependencies[ dependency_key ] = requirements_dict
+ return tool_dependencies
+
+ def generate_message_for_invalid_tool_dependencies( self, metadata_dict ):
+ """
+ Tool dependency definitions can only be invalid if they include a definition for a complex
+ repository dependency and the repository dependency definition is invalid. This method
+ retrieves the error message associated with the invalid tool dependency for display in the
+ caller.
+ """
+ message = ''
+ if metadata_dict:
+ invalid_tool_dependencies = metadata_dict.get( 'invalid_tool_dependencies', None )
+ if invalid_tool_dependencies:
+ for td_key, requirement_dict in invalid_tool_dependencies.items():
+ error = requirement_dict.get( 'error', None )
+ if error:
+ message = '%s ' % str( error )
+ return message
+
+ def generate_message_for_orphan_tool_dependencies( self, repository, metadata_dict ):
+ """
+ The designation of a ToolDependency into the "orphan" category has evolved over time,
+ and is significantly restricted since the introduction of the TOOL_DEPENDENCY_DEFINITION
+ repository type. This designation is still critical, however, in that it handles the
+ case where a repository contains both tools and a tool_dependencies.xml file, but the
+ definition in the tool_dependencies.xml file is in no way related to anything defined
+ by any of the contained tools' requirement tag sets. This is important in that it is
+ often a result of a typo (e.g., dependency name or version) that differs between the tool
+ dependency definition within the tool_dependencies.xml file and what is defined in the
+ tool config's <requirements> tag sets. In these cases, the user should be presented with
+ a warning message, and this warning message is in fact displayed if the following
+ is_orphan attribute is True. This is tricky because in some cases it may be intentional,
+ and tool dependencies that are categorized as "orphan" are in fact valid.
+ """
+ has_orphan_package_dependencies = False
+ has_orphan_set_environment_dependencies = False
+ message = ''
+ package_orphans_str = ''
+ set_environment_orphans_str = ''
+ # Tool dependencies are categorized as orphan only if the repository contains tools.
+ if metadata_dict:
+ tools = metadata_dict.get( 'tools', [] )
+ invalid_tools = metadata_dict.get( 'invalid_tools', [] )
+ tool_dependencies = metadata_dict.get( 'tool_dependencies', {} )
+ # The use of the orphan_tool_dependencies category in metadata has been deprecated,
+ # but we still need to check in case the metadata is out of date.
+ orphan_tool_dependencies = metadata_dict.get( 'orphan_tool_dependencies', {} )
+ # Updating should cause no problems here since a tool dependency cannot be included
+ # in both dictionaries.
+ tool_dependencies.update( orphan_tool_dependencies )
+ if tool_dependencies and ( tools or invalid_tools ):
+ for td_key, requirements_dict in tool_dependencies.items():
+ if td_key == 'set_environment':
+ # "set_environment": [{"name": "R_SCRIPT_PATH", "type": "set_environment"}]
+ for env_requirements_dict in requirements_dict:
+ name = env_requirements_dict[ 'name' ]
+ type = env_requirements_dict[ 'type' ]
+ if self.tool_dependency_is_orphan( type, name, None, tools ):
+ if not has_orphan_set_environment_dependencies:
+ has_orphan_set_environment_dependencies = True
+ set_environment_orphans_str += "<b>* name:</b> %s, <b>type:</b> %s<br/>" % \
+ ( str( name ), str( type ) )
+ else:
+ # "R/2.15.1": {"name": "R", "readme": "some string", "type": "package", "version": "2.15.1"}
+ name = requirements_dict[ 'name' ]
+ type = requirements_dict[ 'type' ]
+ version = requirements_dict[ 'version' ]
+ if self.tool_dependency_is_orphan( type, name, version, tools ):
+ if not has_orphan_package_dependencies:
+ has_orphan_package_dependencies = True
+ package_orphans_str += "<b>* name:</b> %s, <b>type:</b> %s, <b>version:</b> %s<br/>" % \
+ ( str( name ), str( type ), str( version ) )
+ if has_orphan_package_dependencies:
+ message += "The settings for <b>name</b>, <b>version</b> and <b>type</b> from a "
+ message += "contained tool configuration file's <b>requirement</b> tag do not match "
+ message += "the information for the following tool dependency definitions in the "
+ message += "<b>tool_dependencies.xml</b> file, so these tool dependencies have no "
+ message += "relationship with any tools within this repository.<br/>"
+ message += package_orphans_str
+ if has_orphan_set_environment_dependencies:
+ message += "The settings for <b>name</b> and <b>type</b> from a contained tool "
+ message += "configuration file's <b>requirement</b> tag do not match the information "
+ message += "for the following tool dependency definitions in the <b>tool_dependencies.xml</b> "
+ message += "file, so these tool dependencies have no relationship with any tools within "
+ message += "this repository.<br/>"
+ message += set_environment_orphans_str
+ return message
+
+ def get_installed_and_missing_tool_dependencies_for_installed_repository( self, repository, all_tool_dependencies ):
+ """
+ Return the lists of installed tool dependencies and missing tool dependencies for a Tool Shed
+ repository that has been installed into Galaxy.
+ """
+ if all_tool_dependencies:
+ tool_dependencies = {}
+ missing_tool_dependencies = {}
+ for td_key, val in all_tool_dependencies.items():
+ if td_key in [ 'set_environment' ]:
+ for index, td_info_dict in enumerate( val ):
+ name = td_info_dict[ 'name' ]
+ version = None
+ type = td_info_dict[ 'type' ]
+ tool_dependency = tool_dependency_util.get_tool_dependency_by_name_type_repository( self.app,
+ repository,
+ name,
+ type )
+ if tool_dependency:
+ td_info_dict[ 'repository_id' ] = repository.id
+ td_info_dict[ 'tool_dependency_id' ] = tool_dependency.id
+ if tool_dependency.status:
+ tool_dependency_status = str( tool_dependency.status )
+ else:
+ tool_dependency_status = 'Never installed'
+ td_info_dict[ 'status' ] = tool_dependency_status
+ val[ index ] = td_info_dict
+ if tool_dependency.status == self.app.install_model.ToolDependency.installation_status.INSTALLED:
+ tool_dependencies[ td_key ] = val
+ else:
+ missing_tool_dependencies[ td_key ] = val
+ else:
+ name = val[ 'name' ]
+ version = val[ 'version' ]
+ type = val[ 'type' ]
+ tool_dependency = tool_dependency_util.get_tool_dependency_by_name_version_type_repository( self.app,
+ repository,
+ name,
+ version,
+ type )
+ if tool_dependency:
+ val[ 'repository_id' ] = repository.id
+ val[ 'tool_dependency_id' ] = tool_dependency.id
+ if tool_dependency.status:
+ tool_dependency_status = str( tool_dependency.status )
+ else:
+ tool_dependency_status = 'Never installed'
+ val[ 'status' ] = tool_dependency_status
+ if tool_dependency.status == self.app.install_model.ToolDependency.installation_status.INSTALLED:
+ tool_dependencies[ td_key ] = val
+ else:
+ missing_tool_dependencies[ td_key ] = val
+ else:
+ tool_dependencies = None
+ missing_tool_dependencies = None
+ return tool_dependencies, missing_tool_dependencies
+
+ def merge_containers_dicts_for_new_install( self, containers_dicts ):
+ """
+ When installing one or more tool shed repositories for the first time, the received list of
+ containers_dicts contains a containers_dict for each repository being installed. Since the
+ repositories are being installed for the first time, all entries are None except the repository
+ dependencies and tool dependencies. The entries for missing dependencies are all None since
+ they have previously been merged into the installed dependencies. This method will merge the
+ dependencies entries into a single container and return it for display.
+ """
+ new_containers_dict = dict( readme_files=None,
+ datatypes=None,
+ missing_repository_dependencies=None,
+ repository_dependencies=None,
+ missing_tool_dependencies=None,
+ tool_dependencies=None,
+ invalid_tools=None,
+ valid_tools=None,
+ workflows=None )
+ if containers_dicts:
+ lock = threading.Lock()
+ lock.acquire( True )
+ try:
+ repository_dependencies_root_folder = None
+ tool_dependencies_root_folder = None
+ # Use a starting folder id that is hopefully unique.
+ folder_id = 867
+ for old_container_dict in containers_dicts:
+ # Merge repository_dependencies.
+ old_container_repository_dependencies_root = old_container_dict[ 'repository_dependencies' ]
+ if old_container_repository_dependencies_root:
+ if repository_dependencies_root_folder is None:
+ repository_dependencies_root_folder = container_util.Folder( id=folder_id,
+ key='root',
+ label='root',
+ parent=None )
+ folder_id += 1
+ repository_dependencies_folder = container_util.Folder( id=folder_id,
+ key='merged',
+ label='Repository dependencies',
+ parent=repository_dependencies_root_folder )
+ folder_id += 1
+ # The old_container_repository_dependencies_root will be a root folder containing a single sub_folder.
+ old_container_repository_dependencies_folder = old_container_repository_dependencies_root.folders[ 0 ]
+ # Change the folder id so it won't conflict with others being merged.
+ old_container_repository_dependencies_folder.id = folder_id
+ folder_id += 1
+ repository_components_tuple = container_util.get_components_from_key( old_container_repository_dependencies_folder.key )
+ components_list = suc.extract_components_from_tuple( repository_components_tuple )
+ name = components_list[ 1 ]
+ # Generate the label by retrieving the repository name.
+ old_container_repository_dependencies_folder.label = str( name )
+ repository_dependencies_folder.folders.append( old_container_repository_dependencies_folder )
+ # Merge tool_dependencies.
+ old_container_tool_dependencies_root = old_container_dict[ 'tool_dependencies' ]
+ if old_container_tool_dependencies_root:
+ if tool_dependencies_root_folder is None:
+ tool_dependencies_root_folder = container_util.Folder( id=folder_id,
+ key='root',
+ label='root',
+ parent=None )
+ folder_id += 1
+ tool_dependencies_folder = container_util.Folder( id=folder_id,
+ key='merged',
+ label='Tool dependencies',
+ parent=tool_dependencies_root_folder )
+ folder_id += 1
+ else:
+ td_list = [ td.listify for td in tool_dependencies_folder.tool_dependencies ]
+ # The old_container_tool_dependencies_root will be a root folder containing a single sub_folder.
+ old_container_tool_dependencies_folder = old_container_tool_dependencies_root.folders[ 0 ]
+ for td in old_container_tool_dependencies_folder.tool_dependencies:
+ if td.listify not in td_list:
+ tool_dependencies_folder.tool_dependencies.append( td )
+ if repository_dependencies_root_folder:
+ repository_dependencies_root_folder.folders.append( repository_dependencies_folder )
+ new_containers_dict[ 'repository_dependencies' ] = repository_dependencies_root_folder
+ if tool_dependencies_root_folder:
+ tool_dependencies_root_folder.folders.append( tool_dependencies_folder )
+ new_containers_dict[ 'tool_dependencies' ] = tool_dependencies_root_folder
+ except Exception, e:
+ log.debug( "Exception in merge_containers_dicts_for_new_install: %s" % str( e ) )
+ finally:
+ lock.release()
+ return new_containers_dict
+
+ def merge_missing_repository_dependencies_to_installed_container( self, containers_dict ):
+ """
+ Merge the list of missing repository dependencies into the list of installed
+ repository dependencies.
+ """
+ missing_rd_container_root = containers_dict.get( 'missing_repository_dependencies', None )
+ if missing_rd_container_root:
+ # The missing_rd_container_root will be a root folder containing a single sub_folder.
+ missing_rd_container = missing_rd_container_root.folders[ 0 ]
+ installed_rd_container_root = containers_dict.get( 'repository_dependencies', None )
+ # The installed_rd_container_root will be a root folder containing a single sub_folder.
+ if installed_rd_container_root:
+ installed_rd_container = installed_rd_container_root.folders[ 0 ]
+ installed_rd_container.label = 'Repository dependencies'
+ for index, rd in enumerate( missing_rd_container.repository_dependencies ):
+ # Skip the header row.
+ if index == 0:
+ continue
+ installed_rd_container.repository_dependencies.append( rd )
+ installed_rd_container_root.folders = [ installed_rd_container ]
+ containers_dict[ 'repository_dependencies' ] = installed_rd_container_root
+ else:
+ # Change the folder label from 'Missing repository dependencies' to be
+ # 'Repository dependencies' for display.
+ root_container = containers_dict[ 'missing_repository_dependencies' ]
+ for sub_container in root_container.folders:
+ # There should only be 1 sub-folder.
+ sub_container.label = 'Repository dependencies'
+ containers_dict[ 'repository_dependencies' ] = root_container
+ containers_dict[ 'missing_repository_dependencies' ] = None
+ return containers_dict
+
+ def merge_missing_tool_dependencies_to_installed_container( self, containers_dict ):
+ """
+ Merge the list of missing tool dependencies into the list of installed tool
+ dependencies.
+ """
+ missing_td_container_root = containers_dict.get( 'missing_tool_dependencies', None )
+ if missing_td_container_root:
+ # The missing_td_container_root will be a root folder containing a single sub_folder.
+ missing_td_container = missing_td_container_root.folders[ 0 ]
+ installed_td_container_root = containers_dict.get( 'tool_dependencies', None )
+ # The installed_td_container_root will be a root folder containing a single sub_folder.
+ if installed_td_container_root:
+ installed_td_container = installed_td_container_root.folders[ 0 ]
+ installed_td_container.label = 'Tool dependencies'
+ for index, td in enumerate( missing_td_container.tool_dependencies ):
+ # Skip the header row.
+ if index == 0:
+ continue
+ installed_td_container.tool_dependencies.append( td )
+ installed_td_container_root.folders = [ installed_td_container ]
+ containers_dict[ 'tool_dependencies' ] = installed_td_container_root
+ else:
+ # Change the folder label from 'Missing tool dependencies' to be
+ # 'Tool dependencies' for display.
+ root_container = containers_dict[ 'missing_tool_dependencies' ]
+ for sub_container in root_container.folders:
+ # There should only be 1 subfolder.
+ sub_container.label = 'Tool dependencies'
+ containers_dict[ 'tool_dependencies' ] = root_container
+ containers_dict[ 'missing_tool_dependencies' ] = None
+ return containers_dict
+
+ def populate_containers_dict_for_new_install( self, tool_shed_url, tool_path, readme_files_dict,
+ installed_repository_dependencies, missing_repository_dependencies,
+ installed_tool_dependencies, missing_tool_dependencies,
+ updating=False ):
+ """
+ Return the populated containers for a repository being installed for the first time
+ or for an installed repository that is being updated and the updates include newly
+ defined repository (and possibly tool) dependencies.
+ """
+ installed_tool_dependencies, missing_tool_dependencies = \
+ self.populate_tool_dependencies_dicts( tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ repository_installed_tool_dependencies=installed_tool_dependencies,
+ repository_missing_tool_dependencies=missing_tool_dependencies,
+ required_repo_info_dicts=None )
+ # Most of the repository contents are set to None since we don't yet know what they are.
+ containers_dict = \
+ container_util.build_repository_containers_for_galaxy( app=self.app,
+ repository=None,
+ datatypes=None,
+ invalid_tools=None,
+ missing_repository_dependencies=missing_repository_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ readme_files_dict=readme_files_dict,
+ repository_dependencies=installed_repository_dependencies,
+ tool_dependencies=installed_tool_dependencies,
+ valid_tools=None,
+ workflows=None,
+ valid_data_managers=None,
+ invalid_data_managers=None,
+ data_managers_errors=None,
+ new_install=True,
+ reinstalling=False )
+ if not updating:
+ # If we are installing a new repository and not updating an installed repository, we can merge
+ # the missing_repository_dependencies container contents to the installed_repository_dependencies
+ # container. When updating an installed repository, merging will result in losing newly defined
+ # dependencies included in the updates.
+ containers_dict = self.merge_missing_repository_dependencies_to_installed_container( containers_dict )
+ # Merge the missing_tool_dependencies container contents to the installed_tool_dependencies container.
+ containers_dict = self.merge_missing_tool_dependencies_to_installed_container( containers_dict )
+ return containers_dict
+
+ def populate_containers_dict_from_repository_metadata( self, tool_shed_url, tool_path, repository, reinstalling=False,
+ required_repo_info_dicts=None ):
+ """
+ Retrieve necessary information from the received repository's metadata to populate the
+ containers_dict for display. This method is called only from Galaxy (not the tool shed)
+ when displaying repository dependencies for installed repositories and when displaying
+ them for uninstalled repositories that are being reinstalled.
+ """
+ metadata = repository.metadata
+ if metadata:
+ # Handle proprietary datatypes.
+ datatypes = metadata.get( 'datatypes', None )
+ # Handle invalid tools.
+ invalid_tools = metadata.get( 'invalid_tools', None )
+ # Handle README files.
+ if repository.has_readme_files:
+ if reinstalling or repository.status not in \
+ [ self.app.install_model.ToolShedRepository.installation_status.DEACTIVATED,
+ self.app.install_model.ToolShedRepository.installation_status.INSTALLED ]:
+ # Since we're reinstalling, we need to send a request to the tool shed to get the README files.
+ tool_shed_url = common_util.get_tool_shed_url_from_tool_shed_registry( self.app, tool_shed_url )
+ params = '?name=%s&owner=%s&changeset_revision=%s' % ( str( repository.name ),
+ str( repository.owner ),
+ str( repository.installed_changeset_revision ) )
+ url = common_util.url_join( tool_shed_url,
+ 'repository/get_readme_files%s' % params )
+ raw_text = common_util.tool_shed_get( self.app, tool_shed_url, url )
+ readme_files_dict = json.from_json_string( raw_text )
+ else:
+ readme_files_dict = readme_util.build_readme_files_dict( self.app,
+ repository,
+ repository.changeset_revision,
+ repository.metadata, tool_path )
+ else:
+ readme_files_dict = None
+ # Handle repository dependencies.
+ installed_repository_dependencies, missing_repository_dependencies = \
+ self.app.installed_repository_manager.get_installed_and_missing_repository_dependencies( repository )
+ # Handle the current repository's tool dependencies.
+ repository_tool_dependencies = metadata.get( 'tool_dependencies', None )
+ # Make sure to display missing tool dependencies as well.
+ repository_invalid_tool_dependencies = metadata.get( 'invalid_tool_dependencies', None )
+ if repository_invalid_tool_dependencies is not None:
+ if repository_tool_dependencies is None:
+ repository_tool_dependencies = {}
+ repository_tool_dependencies.update( repository_invalid_tool_dependencies )
+ repository_installed_tool_dependencies, repository_missing_tool_dependencies = \
+ self.get_installed_and_missing_tool_dependencies_for_installed_repository( repository,
+ repository_tool_dependencies )
+ if reinstalling:
+ installed_tool_dependencies, missing_tool_dependencies = \
+ self.populate_tool_dependencies_dicts( tool_shed_url,
+ tool_path,
+ repository_installed_tool_dependencies,
+ repository_missing_tool_dependencies,
+ required_repo_info_dicts )
+ else:
+ installed_tool_dependencies = repository_installed_tool_dependencies
+ missing_tool_dependencies = repository_missing_tool_dependencies
+ # Handle valid tools.
+ valid_tools = metadata.get( 'tools', None )
+ # Handle workflows.
+ workflows = metadata.get( 'workflows', None )
+ # Handle Data Managers
+ valid_data_managers = None
+ invalid_data_managers = None
+ data_managers_errors = None
+ if 'data_manager' in metadata:
+ valid_data_managers = metadata[ 'data_manager' ].get( 'data_managers', None )
+ invalid_data_managers = metadata[ 'data_manager' ].get( 'invalid_data_managers', None )
+ data_managers_errors = metadata[ 'data_manager' ].get( 'messages', None )
+ containers_dict = \
+ container_util.build_repository_containers_for_galaxy( app=self.app,
+ repository=repository,
+ datatypes=datatypes,
+ invalid_tools=invalid_tools,
+ missing_repository_dependencies=missing_repository_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ readme_files_dict=readme_files_dict,
+ repository_dependencies=installed_repository_dependencies,
+ tool_dependencies=installed_tool_dependencies,
+ valid_tools=valid_tools,
+ workflows=workflows,
+ valid_data_managers=valid_data_managers,
+ invalid_data_managers=invalid_data_managers,
+ data_managers_errors=data_managers_errors,
+ new_install=False,
+ reinstalling=reinstalling )
+ else:
+ containers_dict = dict( datatypes=None,
+ invalid_tools=None,
+ readme_files_dict=None,
+ repository_dependencies=None,
+ tool_dependencies=None,
+ valid_tools=None,
+ workflows=None )
+ return containers_dict
+
+ def populate_tool_dependencies_dicts( self, tool_shed_url, tool_path, repository_installed_tool_dependencies,
+ repository_missing_tool_dependencies, required_repo_info_dicts ):
+ """
+ Return the populated installed_tool_dependencies and missing_tool_dependencies dictionaries
+ for all repositories defined by entries in the received required_repo_info_dicts.
+ """
+ installed_tool_dependencies = None
+ missing_tool_dependencies = None
+ if repository_installed_tool_dependencies is None:
+ repository_installed_tool_dependencies = {}
+ else:
+ # Add the install_dir attribute to the tool_dependencies.
+ repository_installed_tool_dependencies = \
+ self.add_installation_directories_to_tool_dependencies( repository_installed_tool_dependencies )
+ if repository_missing_tool_dependencies is None:
+ repository_missing_tool_dependencies = {}
+ else:
+ # Add the install_dir attribute to the tool_dependencies.
+ repository_missing_tool_dependencies = \
+ self.add_installation_directories_to_tool_dependencies( repository_missing_tool_dependencies )
+ if required_repo_info_dicts:
+ # Handle the tool dependencies defined for each of the repository's repository dependencies.
+ for rid in required_repo_info_dicts:
+ for name, repo_info_tuple in rid.items():
+ description, \
+ repository_clone_url, \
+ changeset_revision, \
+ ctx_rev, \
+ repository_owner, \
+ repository_dependencies, \
+ tool_dependencies = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ if tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ tool_dependencies = self.add_installation_directories_to_tool_dependencies( tool_dependencies )
+ # The required_repository may have been installed with a different changeset revision.
+ required_repository, installed_changeset_revision = \
+ suc.repository_was_previously_installed( self.app,
+ tool_shed_url,
+ name,
+ repo_info_tuple,
+ from_tip=False )
+ if required_repository:
+ required_repository_installed_tool_dependencies, required_repository_missing_tool_dependencies = \
+ self.get_installed_and_missing_tool_dependencies_for_installed_repository( required_repository,
+ tool_dependencies )
+ if required_repository_installed_tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ required_repository_installed_tool_dependencies = \
+ self.add_installation_directories_to_tool_dependencies( required_repository_installed_tool_dependencies )
+ for td_key, td_dict in required_repository_installed_tool_dependencies.items():
+ if td_key not in repository_installed_tool_dependencies:
+ repository_installed_tool_dependencies[ td_key ] = td_dict
+ if required_repository_missing_tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ required_repository_missing_tool_dependencies = \
+ self.add_installation_directories_to_tool_dependencies( required_repository_missing_tool_dependencies )
+ for td_key, td_dict in required_repository_missing_tool_dependencies.items():
+ if td_key not in repository_missing_tool_dependencies:
+ repository_missing_tool_dependencies[ td_key ] = td_dict
+ if repository_installed_tool_dependencies:
+ installed_tool_dependencies = repository_installed_tool_dependencies
+ if repository_missing_tool_dependencies:
+ missing_tool_dependencies = repository_missing_tool_dependencies
+ return installed_tool_dependencies, missing_tool_dependencies
+
+ def tool_dependency_is_orphan( self, type, name, version, tools ):
+ """
+ Determine if the combination of the received type, name and version is defined in the <requirement>
+ tag for at least one tool in the received list of tools. If not, the tool dependency defined by the
+ combination is considered an orphan in its repository in the tool shed.
+ """
+ if type == 'package':
+ if name and version:
+ for tool_dict in tools:
+ requirements = tool_dict.get( 'requirements', [] )
+ for requirement_dict in requirements:
+ req_name = requirement_dict.get( 'name', None )
+ req_version = requirement_dict.get( 'version', None )
+ req_type = requirement_dict.get( 'type', None )
+ if req_name == name and req_version == version and req_type == type:
+ return False
+ elif type == 'set_environment':
+ if name:
+ for tool_dict in tools:
+ requirements = tool_dict.get( 'requirements', [] )
+ for requirement_dict in requirements:
+ req_name = requirement_dict.get( 'name', None )
+ req_type = requirement_dict.get( 'type', None )
+ if req_name == name and req_type == type:
+ return False
+ return True
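The orphan check added above can be exercised in isolation. The following is a minimal sketch of the same decision in plain Python 3 (the requirement dicts are hypothetical shapes mirroring the tool metadata, not the Tool Shed code itself):

```python
def tool_dependency_is_orphan(dep_type, name, version, tools):
    """Return True if no tool in `tools` declares a matching <requirement>."""
    for tool_dict in tools:
        for req in tool_dict.get('requirements', []):
            if req.get('type') != dep_type:
                continue
            if dep_type == 'package':
                # Packages must match on both name and version.
                if req.get('name') == name and req.get('version') == version:
                    return False
            elif dep_type == 'set_environment':
                # Environment settings match on name alone.
                if req.get('name') == name:
                    return False
    return True

tools = [{'requirements': [{'name': 'bwa', 'version': '0.6.2', 'type': 'package'}]}]
print(tool_dependency_is_orphan('package', 'bwa', '0.6.2', tools))       # False: declared by a tool
print(tool_dependency_is_orphan('package', 'samtools', '0.1.19', tools)) # True: orphan
```

A dependency that no tool's `<requirement>` tags reference is flagged as an orphan so the Tool Shed can surface it to the repository owner.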
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/tool_shed/galaxy_install/install_manager.py
--- a/lib/tool_shed/galaxy_install/install_manager.py
+++ b/lib/tool_shed/galaxy_install/install_manager.py
@@ -3,7 +3,6 @@
import os
import sys
import tempfile
-import threading
import traceback
from galaxy import exceptions
@@ -19,7 +18,6 @@
from tool_shed.util import basic_util
from tool_shed.util import common_util
-from tool_shed.util import container_util
from tool_shed.util import data_manager_util
from tool_shed.util import datatype_util
from tool_shed.util import encoding_util
@@ -225,7 +223,8 @@
installed_packages.append( tool_dependency )
if self.app.config.manage_dependency_relationships:
# Add the tool_dependency to the in-memory dictionaries in the installed_repository_manager.
- self.app.installed_repository_manager.handle_tool_dependency_install( tool_shed_repository, tool_dependency )
+ self.app.installed_repository_manager.handle_tool_dependency_install( tool_shed_repository,
+ tool_dependency )
return installed_packages
def install_via_fabric( self, tool_shed_repository, tool_dependency, install_dir, package_name=None, custom_fabfile_path=None,
@@ -371,8 +370,8 @@
# Delete contents of installation directory if attempt at binary installation failed.
installation_directory_contents = os.listdir( installation_directory )
if installation_directory_contents:
- removed, error_message = tool_dependency_util.remove_tool_dependency( self.app,
- tool_dependency )
+ removed, error_message = \
+ tool_dependency_util.remove_tool_dependency( self.app, tool_dependency )
if removed:
can_install_from_source = True
else:
@@ -894,7 +893,8 @@
tool_shed_repository,
install_model.ToolShedRepository.installation_status.INSTALLED )
if self.app.config.manage_dependency_relationships:
- # Add the installed repository and any tool dependencies to the in-memory dictionaries in the installed_repository_manager.
+ # Add the installed repository and any tool dependencies to the in-memory dictionaries
+ # in the installed_repository_manager.
self.app.installed_repository_manager.handle_repository_install( tool_shed_repository )
else:
# An error occurred while cloning the repository, so reset everything necessary to enable another attempt.
@@ -906,147 +906,6 @@
uninstalled=False,
remove_from_disk=True )
- def merge_containers_dicts_for_new_install( self, containers_dicts ):
- """
- When installing one or more tool shed repositories for the first time, the received list of
- containers_dicts contains a containers_dict for each repository being installed. Since the
- repositories are being installed for the first time, all entries are None except the repository
- dependencies and tool dependencies. The entries for missing dependencies are all None since
- they have previously been merged into the installed dependencies. This method will merge the
- dependencies entries into a single container and return it for display.
- """
- new_containers_dict = dict( readme_files=None,
- datatypes=None,
- missing_repository_dependencies=None,
- repository_dependencies=None,
- missing_tool_dependencies=None,
- tool_dependencies=None,
- invalid_tools=None,
- valid_tools=None,
- workflows=None )
- if containers_dicts:
- lock = threading.Lock()
- lock.acquire( True )
- try:
- repository_dependencies_root_folder = None
- tool_dependencies_root_folder = None
- # Use a unique folder id (hopefully the following is).
- folder_id = 867
- for old_container_dict in containers_dicts:
- # Merge repository_dependencies.
- old_container_repository_dependencies_root = old_container_dict[ 'repository_dependencies' ]
- if old_container_repository_dependencies_root:
- if repository_dependencies_root_folder is None:
- repository_dependencies_root_folder = container_util.Folder( id=folder_id,
- key='root',
- label='root',
- parent=None )
- folder_id += 1
- repository_dependencies_folder = container_util.Folder( id=folder_id,
- key='merged',
- label='Repository dependencies',
- parent=repository_dependencies_root_folder )
- folder_id += 1
- # The old_container_repository_dependencies_root will be a root folder containing a single sub_folder.
- old_container_repository_dependencies_folder = old_container_repository_dependencies_root.folders[ 0 ]
- # Change the folder id so it won't confict with others being merged.
- old_container_repository_dependencies_folder.id = folder_id
- folder_id += 1
- repository_components_tuple = container_util.get_components_from_key( old_container_repository_dependencies_folder.key )
- components_list = suc.extract_components_from_tuple( repository_components_tuple )
- name = components_list[ 1 ]
- # Generate the label by retrieving the repository name.
- old_container_repository_dependencies_folder.label = str( name )
- repository_dependencies_folder.folders.append( old_container_repository_dependencies_folder )
- # Merge tool_dependencies.
- old_container_tool_dependencies_root = old_container_dict[ 'tool_dependencies' ]
- if old_container_tool_dependencies_root:
- if tool_dependencies_root_folder is None:
- tool_dependencies_root_folder = container_util.Folder( id=folder_id,
- key='root',
- label='root',
- parent=None )
- folder_id += 1
- tool_dependencies_folder = container_util.Folder( id=folder_id,
- key='merged',
- label='Tool dependencies',
- parent=tool_dependencies_root_folder )
- folder_id += 1
- else:
- td_list = [ td.listify for td in tool_dependencies_folder.tool_dependencies ]
- # The old_container_tool_dependencies_root will be a root folder containing a single sub_folder.
- old_container_tool_dependencies_folder = old_container_tool_dependencies_root.folders[ 0 ]
- for td in old_container_tool_dependencies_folder.tool_dependencies:
- if td.listify not in td_list:
- tool_dependencies_folder.tool_dependencies.append( td )
- if repository_dependencies_root_folder:
- repository_dependencies_root_folder.folders.append( repository_dependencies_folder )
- new_containers_dict[ 'repository_dependencies' ] = repository_dependencies_root_folder
- if tool_dependencies_root_folder:
- tool_dependencies_root_folder.folders.append( tool_dependencies_folder )
- new_containers_dict[ 'tool_dependencies' ] = tool_dependencies_root_folder
- except Exception, e:
- log.debug( "Exception in merge_containers_dicts_for_new_install: %s" % str( e ) )
- finally:
- lock.release()
- return new_containers_dict
-
- def merge_missing_repository_dependencies_to_installed_container( self, containers_dict ):
- """Merge the list of missing repository dependencies into the list of installed repository dependencies."""
- missing_rd_container_root = containers_dict.get( 'missing_repository_dependencies', None )
- if missing_rd_container_root:
- # The missing_rd_container_root will be a root folder containing a single sub_folder.
- missing_rd_container = missing_rd_container_root.folders[ 0 ]
- installed_rd_container_root = containers_dict.get( 'repository_dependencies', None )
- # The installed_rd_container_root will be a root folder containing a single sub_folder.
- if installed_rd_container_root:
- installed_rd_container = installed_rd_container_root.folders[ 0 ]
- installed_rd_container.label = 'Repository dependencies'
- for index, rd in enumerate( missing_rd_container.repository_dependencies ):
- # Skip the header row.
- if index == 0:
- continue
- installed_rd_container.repository_dependencies.append( rd )
- installed_rd_container_root.folders = [ installed_rd_container ]
- containers_dict[ 'repository_dependencies' ] = installed_rd_container_root
- else:
- # Change the folder label from 'Missing repository dependencies' to be 'Repository dependencies' for display.
- root_container = containers_dict[ 'missing_repository_dependencies' ]
- for sub_container in root_container.folders:
- # There should only be 1 sub-folder.
- sub_container.label = 'Repository dependencies'
- containers_dict[ 'repository_dependencies' ] = root_container
- containers_dict[ 'missing_repository_dependencies' ] = None
- return containers_dict
-
- def merge_missing_tool_dependencies_to_installed_container( self, containers_dict ):
- """ Merge the list of missing tool dependencies into the list of installed tool dependencies."""
- missing_td_container_root = containers_dict.get( 'missing_tool_dependencies', None )
- if missing_td_container_root:
- # The missing_td_container_root will be a root folder containing a single sub_folder.
- missing_td_container = missing_td_container_root.folders[ 0 ]
- installed_td_container_root = containers_dict.get( 'tool_dependencies', None )
- # The installed_td_container_root will be a root folder containing a single sub_folder.
- if installed_td_container_root:
- installed_td_container = installed_td_container_root.folders[ 0 ]
- installed_td_container.label = 'Tool dependencies'
- for index, td in enumerate( missing_td_container.tool_dependencies ):
- # Skip the header row.
- if index == 0:
- continue
- installed_td_container.tool_dependencies.append( td )
- installed_td_container_root.folders = [ installed_td_container ]
- containers_dict[ 'tool_dependencies' ] = installed_td_container_root
- else:
- # Change the folder label from 'Missing tool dependencies' to be 'Tool dependencies' for display.
- root_container = containers_dict[ 'missing_tool_dependencies' ]
- for sub_container in root_container.folders:
- # There should only be 1 subfolder.
- sub_container.label = 'Tool dependencies'
- containers_dict[ 'tool_dependencies' ] = root_container
- containers_dict[ 'missing_tool_dependencies' ] = None
- return containers_dict
-
def order_components_for_installation( self, tsr_ids, repo_info_dicts, tool_panel_section_keys ):
"""
Some repositories may have repository dependencies that are required to be installed
@@ -1096,45 +955,3 @@
ordered_repo_info_dicts.append( repo_info_dict )
ordered_tool_panel_section_keys.append( tool_panel_section_key )
return ordered_tsr_ids, ordered_repo_info_dicts, ordered_tool_panel_section_keys
-
- def populate_containers_dict_for_new_install( self, tool_shed_url, tool_path, readme_files_dict, installed_repository_dependencies,
- missing_repository_dependencies, installed_tool_dependencies, missing_tool_dependencies,
- updating=False ):
- """
- Return the populated containers for a repository being installed for the first time or for an installed repository
- that is being updated and the updates include newly defined repository (and possibly tool) dependencies.
- """
- installed_tool_dependencies, missing_tool_dependencies = \
- tool_dependency_util.populate_tool_dependencies_dicts( app=self.app,
- tool_shed_url=tool_shed_url,
- tool_path=tool_path,
- repository_installed_tool_dependencies=installed_tool_dependencies,
- repository_missing_tool_dependencies=missing_tool_dependencies,
- required_repo_info_dicts=None )
- # Most of the repository contents are set to None since we don't yet know what they are.
- containers_dict = \
- container_util.build_repository_containers_for_galaxy( app=self.app,
- repository=None,
- datatypes=None,
- invalid_tools=None,
- missing_repository_dependencies=missing_repository_dependencies,
- missing_tool_dependencies=missing_tool_dependencies,
- readme_files_dict=readme_files_dict,
- repository_dependencies=installed_repository_dependencies,
- tool_dependencies=installed_tool_dependencies,
- valid_tools=None,
- workflows=None,
- valid_data_managers=None,
- invalid_data_managers=None,
- data_managers_errors=None,
- new_install=True,
- reinstalling=False )
- if not updating:
- # If we installing a new repository and not updaing an installed repository, we can merge
- # the missing_repository_dependencies container contents to the installed_repository_dependencies
- # container. When updating an installed repository, merging will result in losing newly defined
- # dependencies included in the updates.
- containers_dict = self.merge_missing_repository_dependencies_to_installed_container( containers_dict )
- # Merge the missing_tool_dependencies container contents to the installed_tool_dependencies container.
- containers_dict = self.merge_missing_tool_dependencies_to_installed_container( containers_dict )
- return containers_dict
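The two `merge_missing_*_to_installed_container` helpers removed above share one pattern: if an "installed" container exists, append the missing container's rows to it (skipping the header row at index 0); otherwise just relabel the missing container and promote it. A minimal sketch of that pattern, using plain dicts in place of `container_util.Folder` (the dict shapes are an assumption for illustration):

```python
def merge_missing_into_installed(containers, missing_key, installed_key, label):
    """Fold the 'missing' container's rows into the 'installed' container,
    or relabel and promote the 'missing' container if nothing is installed."""
    missing_root = containers.get(missing_key)
    if missing_root is None:
        return containers
    installed_root = containers.get(installed_key)
    if installed_root is not None:
        installed = installed_root['folders'][0]
        installed['label'] = label
        # Skip the header row (index 0) of the missing sub-folder.
        installed['rows'].extend(missing_root['folders'][0]['rows'][1:])
    else:
        for sub in missing_root['folders']:
            sub['label'] = label
        containers[installed_key] = missing_root
    containers[missing_key] = None
    return containers

containers = {
    'tool_dependencies': {'folders': [{'label': 'Tool dependencies',
                                       'rows': ['header', 'bwa 0.6.2']}]},
    'missing_tool_dependencies': {'folders': [{'label': 'Missing tool dependencies',
                                               'rows': ['header', 'samtools 0.1.19']}]},
}
merge_missing_into_installed(containers, 'missing_tool_dependencies',
                             'tool_dependencies', 'Tool dependencies')
# containers['tool_dependencies'] now lists both rows; the missing entry is None.
```

As the removed docstrings note, this merge is only safe on a first-time install; when updating, it would hide newly defined dependencies that are genuinely missing.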
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/tool_shed/galaxy_install/installed_repository_manager.py
--- a/lib/tool_shed/galaxy_install/installed_repository_manager.py
+++ b/lib/tool_shed/galaxy_install/installed_repository_manager.py
@@ -395,7 +395,13 @@
installed_repository_dependencies = {}
missing_rd_tups = []
installed_rd_tups = []
- description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = \
+ description, \
+ repository_clone_url, \
+ changeset_revision, \
+ ctx_rev, \
+ repository_owner, \
+ repository_dependencies, \
+ tool_dependencies = \
suc.get_repo_info_tuple_contents( repo_info_tuple )
if repository_dependencies:
description = repository_dependencies[ 'description' ]
@@ -614,6 +620,30 @@
type = str( tool_dependency.type )
return ( tool_dependency.tool_shed_repository_id, str( tool_dependency.name ), str( tool_dependency.version ), type )
+ def handle_existing_tool_dependencies_that_changed_in_update( self, repository, original_dependency_dict,
+ new_dependency_dict ):
+ """
+ This method is called when a Galaxy admin is getting updates for an installed tool shed
+ repository in order to cover the case where an existing tool dependency was changed (e.g.,
+ the version of the dependency was changed) but the tool version for which it is a dependency
+ was not changed. In this case, we only want to determine if any of the dependency information
+ defined in original_dependency_dict was changed in new_dependency_dict. We don't care if new
+ dependencies were added in new_dependency_dict since they will just be treated as missing
+ dependencies for the tool.
+ """
+ updated_tool_dependency_names = []
+ deleted_tool_dependency_names = []
+ for original_dependency_key, original_dependency_val_dict in original_dependency_dict.items():
+ if original_dependency_key not in new_dependency_dict:
+ updated_tool_dependency = self.update_existing_tool_dependency( repository,
+ original_dependency_val_dict,
+ new_dependency_dict )
+ if updated_tool_dependency:
+ updated_tool_dependency_names.append( updated_tool_dependency.name )
+ else:
+ deleted_tool_dependency_names.append( original_dependency_val_dict[ 'name' ] )
+ return updated_tool_dependency_names, deleted_tool_dependency_names
+
def handle_repository_install( self, repository ):
"""Load the dependency relationships for a repository that was just installed or reinstalled."""
# Populate self.repository_dependencies_of_installed_repositories.
@@ -924,3 +954,68 @@
repository_dependency.changeset_revision == changeset_revision ):
return True
return False
+
+ def update_existing_tool_dependency( self, repository, original_dependency_dict, new_dependencies_dict ):
+ """
+ Update an existing tool dependency whose definition was updated in a changeset
+ pulled by a Galaxy administrator when getting updates to an installed tool shed
+ repository. The original_dependency_dict is a single tool dependency definition,
+ an example of which is::
+
+ {"name": "bwa",
+ "readme": "\\nCompiling BWA requires zlib and libpthread to be present on your system.\\n ",
+ "type": "package",
+ "version": "0.6.2"}
+
+ The new_dependencies_dict is the dictionary generated by the metadata_util.generate_tool_dependency_metadata method.
+ """
+ new_tool_dependency = None
+ original_name = original_dependency_dict[ 'name' ]
+ original_type = original_dependency_dict[ 'type' ]
+ original_version = original_dependency_dict[ 'version' ]
+ # Locate the appropriate tool_dependency associated with the repository.
+ tool_dependency = None
+ for tool_dependency in repository.tool_dependencies:
+ if tool_dependency.name == original_name and \
+ tool_dependency.type == original_type and \
+ tool_dependency.version == original_version:
+ break
+ if tool_dependency and tool_dependency.can_update:
+ dependency_install_dir = tool_dependency.installation_directory( self.app )
+ removed_from_disk, error_message = \
+ tool_dependency_util.remove_tool_dependency_installation_directory( dependency_install_dir )
+ if removed_from_disk:
+ context = self.app.install_model.context
+ new_dependency_name = None
+ new_dependency_type = None
+ new_dependency_version = None
+ for new_dependency_key, new_dependency_val_dict in new_dependencies_dict.items():
+ # Match on name only, hopefully this will be enough!
+ if original_name == new_dependency_val_dict[ 'name' ]:
+ new_dependency_name = new_dependency_val_dict[ 'name' ]
+ new_dependency_type = new_dependency_val_dict[ 'type' ]
+ new_dependency_version = new_dependency_val_dict[ 'version' ]
+ break
+ if new_dependency_name and new_dependency_type and new_dependency_version:
+ # Update all attributes of the tool_dependency record in the database.
+ log.debug( "Updating version %s of tool dependency %s %s to have new version %s and type %s." % \
+ ( str( tool_dependency.version ),
+ str( tool_dependency.type ),
+ str( tool_dependency.name ),
+ str( new_dependency_version ),
+ str( new_dependency_type ) ) )
+ tool_dependency.type = new_dependency_type
+ tool_dependency.version = new_dependency_version
+ tool_dependency.status = self.app.install_model.ToolDependency.installation_status.UNINSTALLED
+ tool_dependency.error_message = None
+ context.add( tool_dependency )
+ context.flush()
+ new_tool_dependency = tool_dependency
+ else:
+ # We have no new tool dependency definition based on a matching dependency name, so remove
+ # the existing tool dependency record from the database.
+ log.debug( "Deleting version %s of tool dependency %s %s from the database since it is no longer defined." % \
+ ( str( tool_dependency.version ), str( tool_dependency.type ), str( tool_dependency.name ) ) )
+ context.delete( tool_dependency )
+ context.flush()
+ return new_tool_dependency
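The core of `update_existing_tool_dependency` above is a name-only lookup: match the original dependency against the regenerated definitions by name, take the new type and version if found, and treat the dependency as deleted otherwise. A standalone sketch of just that lookup (hypothetical dict shapes taken from the docstring example):

```python
def resolve_updated_dependency(original, new_definitions):
    """Return the (type, version) an existing dependency should be updated to,
    matching on name only, or None if the dependency was deleted upstream."""
    for new_def in new_definitions.values():
        if new_def['name'] == original['name']:
            return new_def['type'], new_def['version']
    return None

original = {'name': 'bwa', 'type': 'package', 'version': '0.6.2'}
new_defs = {'package_bwa_0.7.5': {'name': 'bwa', 'type': 'package', 'version': '0.7.5'}}
print(resolve_updated_dependency(original, new_defs))       # ('package', '0.7.5')
print(resolve_updated_dependency({'name': 'samtools'}, new_defs))  # None
```

As the in-code comment concedes ("Match on name only, hopefully this will be enough!"), this heuristic cannot distinguish a renamed dependency from a deleted one.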
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/tool_shed/galaxy_install/tool_dependencies/recipe/tag_handler.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/recipe/tag_handler.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/recipe/tag_handler.py
@@ -25,7 +25,89 @@
raise "Unimplemented Method"
-class Install( RecipeTag ):
+class SyncDatabase( object ):
+
+ def sync_database_with_file_system( self, app, tool_shed_repository, tool_dependency_name, tool_dependency_version,
+ tool_dependency_install_dir, tool_dependency_type='package' ):
+ """
+ The installation directory defined by the received tool_dependency_install_dir exists, so check for
+ the presence of INSTALLATION_LOG. If the file exists, we'll assume the tool dependency is installed,
+ but not necessarily successfully (it could be in an error state on disk). However, we can justifiably
+ assume here that no matter the state, an associated database record will exist.
+ """
+ # This method should be reached very rarely. It implies that either the Galaxy environment
+ # became corrupted (i.e., the database records for installed tool dependencies are not synchronized
+ # with tool dependencies on disk) or the Tool Shed's install and test framework is running. The Tool
+ # Shed's install and test framework installs repositories in 2 stages, those of type tool_dependency_definition
+ # followed by those containing valid tools and tool functional test components.
+ log.debug( "Synchronizing the database with the file system..." )
+ try:
+ log.debug( "The value of app.config.running_functional_tests is: %s" % \
+ str( app.config.running_functional_tests ) )
+ except:
+ pass
+ sa_session = app.install_model.context
+ can_install_tool_dependency = False
+ tool_dependency = get_tool_dependency_by_name_version_type_repository( app,
+ tool_shed_repository,
+ tool_dependency_name,
+ tool_dependency_version,
+ tool_dependency_type )
+ if tool_dependency.status == app.install_model.ToolDependency.installation_status.INSTALLING:
+ # The tool dependency is in an Installing state, so we don't want to do anything to it. If the tool
+ # dependency is being installed by someone else, we don't want to interfere with that. This assumes
+ # the installation by "someone else" is not hung in an Installing state, which is a weakness if that
+ # "someone else" never repaired it.
+ log.debug( 'Skipping installation of tool dependency %s version %s because it has a status of %s' % \
+ ( str( tool_dependency.name ), str( tool_dependency.version ), str( tool_dependency.status ) ) )
+ else:
+ # We have a pre-existing installation directory on the file system, but our associated database record is
+ # in a state that allowed us to arrive here. At this point, we'll inspect the installation directory to
+ # see if we have a "proper installation" and if so, synchronize the database record rather than reinstalling
+ # the dependency if we're "running_functional_tests". If we're not "running_functional_tests", we'll set
+ # the tool dependency's installation status to ERROR.
+ tool_dependency_installation_directory_contents = os.listdir( tool_dependency_install_dir )
+ if basic_util.INSTALLATION_LOG in tool_dependency_installation_directory_contents:
+ # Since this tool dependency's installation directory contains an installation log, we consider it to be
+ # installed. In some cases the record may be missing from the database due to some activity outside of
+ # the control of the Tool Shed. Since a new record was created for it and we don't know the state of the
+ # files on disk, we will set it to an error state (unless we are running Tool Shed functional tests - see
+ # below).
+ log.debug( 'Skipping installation of tool dependency %s version %s because it is installed in %s' % \
+ ( str( tool_dependency.name ), str( tool_dependency.version ), str( tool_dependency_install_dir ) ) )
+ if app.config.running_functional_tests:
+ # If we are running functional tests, the state will be set to Installed because previously compiled
+ # tool dependencies are not deleted by default by the "install and test" framework.
+ tool_dependency.status = app.install_model.ToolDependency.installation_status.INSTALLED
+ else:
+ error_message = 'The installation directory for this tool dependency had contents but the database had no record. '
+ error_message += 'The installation log may show this tool dependency to be correctly installed, but due to the '
+ error_message += 'missing database record it is now being set to Error.'
+ tool_dependency.status = app.install_model.ToolDependency.installation_status.ERROR
+ tool_dependency.error_message = error_message
+ else:
+ error_message = '\nInstallation path %s for tool dependency %s version %s exists, but the expected file %s' % \
+ ( str( tool_dependency_install_dir ),
+ str( tool_dependency_name ),
+ str( tool_dependency_version ),
+ str( basic_util.INSTALLATION_LOG ) )
+ error_message += ' is missing. This indicates an installation error so the tool dependency is being'
+ error_message += ' prepared for re-installation.'
+ print error_message
+ tool_dependency.status = app.install_model.ToolDependency.installation_status.NEVER_INSTALLED
+ basic_util.remove_dir( tool_dependency_install_dir )
+ can_install_tool_dependency = True
+ sa_session.add( tool_dependency )
+ sa_session.flush()
+ try:
+ log.debug( "Returning from sync_database_with_file_system with tool_dependency %s, can_install_tool_dependency %s." % \
+ ( str( tool_dependency.name ), str( can_install_tool_dependency ) ) )
+ except Exception, e:
+ log.debug( str( e ) )
+ return tool_dependency, can_install_tool_dependency
+
+
+class Install( RecipeTag, SyncDatabase ):
def __init__( self, app ):
self.app = app
@@ -57,13 +139,12 @@
proceed_with_install = True
else:
# Notice that we'll throw away the following tool_dependency if it can be installed.
- tool_dependency, proceed_with_install = \
- tool_dependency_util.sync_database_with_file_system( self.app,
- tool_shed_repository,
- package_name,
- package_version,
- install_dir,
- tool_dependency_type='package' )
+ tool_dependency, proceed_with_install = self.sync_database_with_file_system( self.app,
+ tool_shed_repository,
+ package_name,
+ package_version,
+ install_dir,
+ tool_dependency_type='package' )
if not proceed_with_install:
log.debug( "Tool dependency %s version %s cannot be installed (it was probably previously installed), so returning it." % \
( str( tool_dependency.name ), str( tool_dependency.version ) ) )
@@ -156,7 +237,7 @@
return tool_dependency, proceed_with_install, action_elem_tuples
-class Repository( RecipeTag ):
+class Repository( RecipeTag, SyncDatabase ):
def __init__( self, app ):
self.app = app
@@ -190,7 +271,8 @@
return None
def create_tool_dependency_with_initialized_env_sh_file( self, dependent_install_dir, tool_shed_repository,
- required_repository, package_name, package_version, tool_dependencies_config ):
+ required_repository, package_name, package_version,
+ tool_dependencies_config ):
"""
Create or get a tool_dependency record that is defined by the received package_name and package_version.
An env.sh file will be created for the tool_dependency in the received dependent_install_dir.
@@ -238,10 +320,9 @@
# the path defined by required_tool_dependency_env_file_path. It doesn't matter if the required env.sh
# file currently exists.
required_tool_dependency_env_file_path = \
- tool_dependency_util.get_required_repository_package_env_sh_path( self.app,
- package_name,
- package_version,
- required_repository )
+ self.get_required_repository_package_env_sh_path( package_name,
+ package_version,
+ required_repository )
env_file_builder = EnvFileBuilder( tool_dependency.installation_directory( self.app ) )
env_file_builder.append_line( action="source", value=required_tool_dependency_env_file_path )
return_code = env_file_builder.return_code
@@ -269,6 +350,19 @@
tool_dependencies.append( tool_dependency )
return tool_dependencies
+ def get_required_repository_package_env_sh_path( self, package_name, package_version, required_repository ):
+ """Return path to env.sh file in required repository if the required repository has been installed."""
+ env_sh_file_dir = \
+ tool_dependency_util.get_tool_dependency_install_dir( app=self.app,
+ repository_name=required_repository.name,
+ repository_owner=required_repository.owner,
+ repository_changeset_revision=required_repository.installed_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
+ env_sh_file_path = os.path.join( env_sh_file_dir, 'env.sh' )
+ return env_sh_file_path
+
def get_tool_shed_repository_by_tool_shed_name_owner_changeset_revision( self, tool_shed_url, name, owner, changeset_revision ):
sa_session = self.app.install_model.context
# The protocol is not stored, but the port is if it exists.
@@ -352,13 +446,12 @@
can_install_tool_dependency = True
else:
# Notice that we'll throw away the following tool_dependency if it can be installed.
- tool_dependency, can_install_tool_dependency = \
- tool_dependency_util.sync_database_with_file_system( self.app,
- tool_shed_repository,
- package_name,
- package_version,
- dependent_install_dir,
- tool_dependency_type='package' )
+ tool_dependency, can_install_tool_dependency = self.sync_database_with_file_system( self.app,
+ tool_shed_repository,
+ package_name,
+ package_version,
+ dependent_install_dir,
+ tool_dependency_type='package' )
if not can_install_tool_dependency:
log.debug( "Tool dependency %s version %s cannot be installed (it was probably previously installed), " % \
( str( tool_dependency.name, str( tool_dependency.version ) ) ) )
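The decision tree inside the new `SyncDatabase.sync_database_with_file_system` mixin can be summarized as a pure function over the install directory's contents. The sketch below is illustrative only: the log filename stands in for `basic_util.INSTALLATION_LOG` and the status strings for the `ToolDependency.installation_status` constants.

```python
INSTALLATION_LOG = 'INSTALLATION.log'  # stand-in for basic_util.INSTALLATION_LOG

def classify_existing_install_dir(contents, running_functional_tests):
    """Given the contents of a pre-existing install dir whose database record
    is out of sync, return (new_status, can_reinstall)."""
    if INSTALLATION_LOG in contents:
        # An installation log implies a completed install attempt: trust it only
        # under the functional-test framework, otherwise flag for admin review.
        return ('INSTALLED' if running_functional_tests else 'ERROR'), False
    # No log: the install never finished, so wipe the directory and retry.
    return 'NEVER_INSTALLED', True

print(classify_existing_install_dir(['INSTALLATION.log'], False))  # ('ERROR', False)
print(classify_existing_install_dir([], False))                    # ('NEVER_INSTALLED', True)
```

Dependencies already in an `Installing` state are skipped before this classification ever runs, so a concurrent installer is never disturbed.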
diff -r 3232e68101a46276f07c1f545ccf37c55552f66a -r 5af80141674f5ac3166b6782385c408258606d0a lib/tool_shed/util/metadata_util.py
--- a/lib/tool_shed/util/metadata_util.py
+++ b/lib/tool_shed/util/metadata_util.py
@@ -3,7 +3,6 @@
import tempfile
from galaxy import util
-from galaxy import web
from galaxy.datatypes import checkers
from galaxy.model.orm import and_
from galaxy.tools.data_manager.manager import DataManager
@@ -11,17 +10,17 @@
from galaxy.util import json
from galaxy.web import url_for
-import tool_shed.util.shed_util_common as suc
from tool_shed.repository_types.metadata import TipOnly
+import tool_shed.repository_types.util as rt_util
+
from tool_shed.util import basic_util
from tool_shed.util import common_util
-from tool_shed.util import container_util
from tool_shed.util import hg_util
from tool_shed.util import readme_util
+from tool_shed.util import shed_util_common as suc
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
from tool_shed.util import xml_util
-import tool_shed.repository_types.util as rt_util
log = logging.getLogger( __name__ )
@@ -1013,7 +1012,11 @@
description = root.get( 'description' )
for elem in root:
if elem.tag == 'package':
- valid_tool_dependencies_dict, invalid_tool_dependencies_dict, repository_dependency_tup, repository_dependency_is_valid, message = \
+ valid_tool_dependencies_dict, \
+ invalid_tool_dependencies_dict, \
+ repository_dependency_tup, \
+ repository_dependency_is_valid, \
+ message = \
generate_package_dependency_metadata( app, elem, valid_tool_dependencies_dict, invalid_tool_dependencies_dict )
if repository_dependency_is_valid:
if repository_dependency_tup and repository_dependency_tup not in valid_repository_dependency_tups:
@@ -1024,7 +1027,13 @@
# We have an invalid complex repository dependency, so mark the tool dependency as invalid.
tool_dependency_is_valid = False
# Append the error message to the invalid repository dependency tuple.
- toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = repository_dependency_tup
+ toolshed, \
+ name, \
+ owner, \
+ changeset_revision, \
+ prior_installation_required, \
+ only_if_compiling_contained_td \
+ = repository_dependency_tup
repository_dependency_tup = \
( toolshed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td, message )
invalid_repository_dependency_tups.append( repository_dependency_tup )
@@ -1033,9 +1042,13 @@
valid_tool_dependencies_dict = generate_environment_dependency_metadata( elem, valid_tool_dependencies_dict )
if valid_tool_dependencies_dict:
if original_valid_tool_dependencies_dict:
- # We're generating metadata on an update pulled to a tool shed repository installed into a Galaxy instance, so handle changes to
- # tool dependencies appropriately.
- handle_existing_tool_dependencies_that_changed_in_update( app, repository, original_valid_tool_dependencies_dict, valid_tool_dependencies_dict )
+ # We're generating metadata on an update pulled to a tool shed repository installed
+ # into a Galaxy instance, so handle changes to tool dependencies appropriately.
+ irm = app.installed_repository_manager
+ updated_tool_dependency_names, deleted_tool_dependency_names = \
+ irm.handle_existing_tool_dependencies_that_changed_in_update( repository,
+ original_valid_tool_dependencies_dict,
+ valid_tool_dependencies_dict )
metadata_dict[ 'tool_dependencies' ] = valid_tool_dependencies_dict
if invalid_tool_dependencies_dict:
metadata_dict[ 'invalid_tool_dependencies' ] = invalid_tool_dependencies_dict
@@ -1296,25 +1309,6 @@
sample_file_metadata_paths.append( relative_path_to_sample_file )
return sample_file_metadata_paths, sample_file_copy_paths
-def handle_existing_tool_dependencies_that_changed_in_update( app, repository, original_dependency_dict, new_dependency_dict ):
- """
- This method is called when a Galaxy admin is getting updates for an installed tool shed repository in order to cover the case where an
- existing tool dependency was changed (e.g., the version of the dependency was changed) but the tool version for which it is a dependency
- was not changed. In this case, we only want to determine if any of the dependency information defined in original_dependency_dict was
- changed in new_dependency_dict. We don't care if new dependencies were added in new_dependency_dict since they will just be treated as
- missing dependencies for the tool.
- """
- updated_tool_dependency_names = []
- deleted_tool_dependency_names = []
- for original_dependency_key, original_dependency_val_dict in original_dependency_dict.items():
- if original_dependency_key not in new_dependency_dict:
- updated_tool_dependency = update_existing_tool_dependency( app, repository, original_dependency_val_dict, new_dependency_dict )
- if updated_tool_dependency:
- updated_tool_dependency_names.append( updated_tool_dependency.name )
- else:
- deleted_tool_dependency_names.append( original_dependency_val_dict[ 'name' ] )
- return updated_tool_dependency_names, deleted_tool_dependency_names
-
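The core of the removed helper, spotting original dependency keys that no longer appear in the updated definition, is a plain dictionary comparison; the database update/delete work now lives in the installed repository manager. A simplified, standalone illustration of that comparison (not Galaxy's API):

```python
def changed_dependency_names(original_dependency_dict, new_dependency_dict):
    # Keys present in the original but absent from the new definition indicate
    # a dependency that was changed or deleted by the update. New keys are
    # ignored: brand-new dependencies are simply treated as missing for the tool.
    return [val_dict['name']
            for key, val_dict in original_dependency_dict.items()
            if key not in new_dependency_dict]

original = {'bwa/0.6.2/package': {'name': 'bwa', 'version': '0.6.2', 'type': 'package'}}
updated = {'bwa/0.7.7/package': {'name': 'bwa', 'version': '0.7.7', 'type': 'package'}}
print(changed_dependency_names(original, updated))  # ['bwa']
```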
def handle_repository_elem( app, repository_elem, only_if_compiling_contained_td=False, updating_installed_repository=False ):
"""
Process the received repository_elem which is a <repository> tag either from a repository_dependencies.xml
@@ -1737,104 +1731,6 @@
# The received metadata_dict includes no metadata for workflows, so a new repository_metadata table record is not needed.
return False
-def populate_containers_dict_from_repository_metadata( app, tool_shed_url, tool_path, repository,
- reinstalling=False, required_repo_info_dicts=None ):
- """
- Retrieve necessary information from the received repository's metadata to populate the
- containers_dict for display. This method is called only from Galaxy (not the tool shed)
- when displaying repository dependencies for installed repositories and when displaying
- them for uninstalled repositories that are being reinstalled.
- """
- metadata = repository.metadata
- if metadata:
- # Handle proprietary datatypes.
- datatypes = metadata.get( 'datatypes', None )
- # Handle invalid tools.
- invalid_tools = metadata.get( 'invalid_tools', None )
- # Handle README files.
- if repository.has_readme_files:
- if reinstalling or repository.status not in [ app.install_model.ToolShedRepository.installation_status.DEACTIVATED,
- app.install_model.ToolShedRepository.installation_status.INSTALLED ]:
- # Since we're reinstalling, we need to send a request to the tool shed to get the README files.
- tool_shed_url = common_util.get_tool_shed_url_from_tool_shed_registry( app, tool_shed_url )
- params = '?name=%s&owner=%s&changeset_revision=%s' % ( str( repository.name ),
- str( repository.owner ),
- str( repository.installed_changeset_revision ) )
- url = common_util.url_join( tool_shed_url,
- 'repository/get_readme_files%s' % params )
- raw_text = common_util.tool_shed_get( app, tool_shed_url, url )
- readme_files_dict = json.from_json_string( raw_text )
- else:
- readme_files_dict = readme_util.build_readme_files_dict( app,
- repository,
- repository.changeset_revision,
- repository.metadata, tool_path )
- else:
- readme_files_dict = None
- # Handle repository dependencies.
- installed_repository_dependencies, missing_repository_dependencies = \
- app.installed_repository_manager.get_installed_and_missing_repository_dependencies( repository )
- # Handle the current repository's tool dependencies.
- repository_tool_dependencies = metadata.get( 'tool_dependencies', None )
- # Make sure to display missing tool dependencies as well.
- repository_invalid_tool_dependencies = metadata.get( 'invalid_tool_dependencies', None )
- if repository_invalid_tool_dependencies is not None:
- if repository_tool_dependencies is None:
- repository_tool_dependencies = {}
- repository_tool_dependencies.update( repository_invalid_tool_dependencies )
- repository_installed_tool_dependencies, repository_missing_tool_dependencies = \
- tool_dependency_util.get_installed_and_missing_tool_dependencies_for_installed_repository( app,
- repository,
- repository_tool_dependencies )
- if reinstalling:
- installed_tool_dependencies, missing_tool_dependencies = \
- tool_dependency_util.populate_tool_dependencies_dicts( app,
- tool_shed_url,
- tool_path,
- repository_installed_tool_dependencies,
- repository_missing_tool_dependencies,
- required_repo_info_dicts )
- else:
- installed_tool_dependencies = repository_installed_tool_dependencies
- missing_tool_dependencies = repository_missing_tool_dependencies
- # Handle valid tools.
- valid_tools = metadata.get( 'tools', None )
- # Handle workflows.
- workflows = metadata.get( 'workflows', None )
- # Handle Data Managers
- valid_data_managers = None
- invalid_data_managers = None
- data_managers_errors = None
- if 'data_manager' in metadata:
- valid_data_managers = metadata['data_manager'].get( 'data_managers', None )
- invalid_data_managers = metadata['data_manager'].get( 'invalid_data_managers', None )
- data_managers_errors = metadata['data_manager'].get( 'messages', None )
- containers_dict = container_util.build_repository_containers_for_galaxy( app=app,
- repository=repository,
- datatypes=datatypes,
- invalid_tools=invalid_tools,
- missing_repository_dependencies=missing_repository_dependencies,
- missing_tool_dependencies=missing_tool_dependencies,
- readme_files_dict=readme_files_dict,
- repository_dependencies=installed_repository_dependencies,
- tool_dependencies=installed_tool_dependencies,
- valid_tools=valid_tools,
- workflows=workflows,
- valid_data_managers=valid_data_managers,
- invalid_data_managers=invalid_data_managers,
- data_managers_errors=data_managers_errors,
- new_install=False,
- reinstalling=reinstalling )
- else:
- containers_dict = dict( datatypes=None,
- invalid_tools=None,
- readme_files_dict=None,
- repository_dependencies=None,
- tool_dependencies=None,
- valid_tools=None,
- workflows=None )
- return containers_dict
-
def reset_all_metadata_on_installed_repository( app, id ):
"""Reset all metadata on a single tool shed repository installed into a Galaxy instance."""
invalid_file_tups = []
@@ -2201,68 +2097,6 @@
error_message, status = set_repository_metadata( app, host, user, repository, content_alert_str=content_alert_str, **kwd )
return status, error_message
-def update_existing_tool_dependency( app, repository, original_dependency_dict, new_dependencies_dict ):
- """
- Update an existing tool dependency whose definition was updated in a change set
- pulled by a Galaxy administrator when getting updates to an installed tool shed
- repository. The original_dependency_dict is a single tool dependency definition,
- an example of which is::
-
- {"name": "bwa",
- "readme": "\\nCompiling BWA requires zlib and libpthread to be present on your system.\\n ",
- "type": "package",
- "version": "0.6.2"}
-
- The new_dependencies_dict is the dictionary generated by the metadata_util.generate_tool_dependency_metadata method.
- """
- new_tool_dependency = None
- original_name = original_dependency_dict[ 'name' ]
- original_type = original_dependency_dict[ 'type' ]
- original_version = original_dependency_dict[ 'version' ]
- # Locate the appropriate tool_dependency associated with the repository.
- tool_dependency = None
- for tool_dependency in repository.tool_dependencies:
- if tool_dependency.name == original_name and tool_dependency.type == original_type and tool_dependency.version == original_version:
- break
- if tool_dependency and tool_dependency.can_update:
- dependency_install_dir = tool_dependency.installation_directory( app )
- removed_from_disk, error_message = tool_dependency_util.remove_tool_dependency_installation_directory( dependency_install_dir )
- if removed_from_disk:
- context = app.install_model.context
- new_dependency_name = None
- new_dependency_type = None
- new_dependency_version = None
- for new_dependency_key, new_dependency_val_dict in new_dependencies_dict.items():
- # Match on name only, hopefully this will be enough!
- if original_name == new_dependency_val_dict[ 'name' ]:
- new_dependency_name = new_dependency_val_dict[ 'name' ]
- new_dependency_type = new_dependency_val_dict[ 'type' ]
- new_dependency_version = new_dependency_val_dict[ 'version' ]
- break
- if new_dependency_name and new_dependency_type and new_dependency_version:
- # Update all attributes of the tool_dependency record in the database.
- log.debug( "Updating version %s of tool dependency %s %s to have new version %s and type %s." % \
- ( str( tool_dependency.version ),
- str( tool_dependency.type ),
- str( tool_dependency.name ),
- str( new_dependency_version ),
- str( new_dependency_type ) ) )
- tool_dependency.type = new_dependency_type
- tool_dependency.version = new_dependency_version
- tool_dependency.status = app.install_model.ToolDependency.installation_status.UNINSTALLED
- tool_dependency.error_message = None
- context.add( tool_dependency )
- context.flush()
- new_tool_dependency = tool_dependency
- else:
- # We have no new tool dependency definition based on a matching dependency name, so remove
- # the existing tool dependency record from the database.
- log.debug( "Deleting version %s of tool dependency %s %s from the database since it is no longer defined." % \
- ( str( tool_dependency.version ), str( tool_dependency.type ), str( tool_dependency.name ) ) )
- context.delete( tool_dependency )
- context.flush()
- return new_tool_dependency
-
def update_repository_dependencies_metadata( metadata, repository_dependency_tups, is_valid, description ):
if is_valid:
repository_dependencies_dict = metadata.get( 'repository_dependencies', None )
This diff is so big that we needed to truncate the remainder.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: davebgx: Make the tool shed's database migrations compatible with MySQL.
by commits-noreply@bitbucket.org 16 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/3232e68101a4/
Changeset: 3232e68101a4
User: davebgx
Date: 2014-07-16 18:50:15
Summary: Make the tool shed's database migrations compatible with MySQL.
Affected #: 1 file
diff -r 10c58feed93f26d8247be96baa111b3408970784 -r 3232e68101a46276f07c1f545ccf37c55552f66a lib/galaxy/webapps/tool_shed/model/migrate/versions/0001_initial_tables.py
--- a/lib/galaxy/webapps/tool_shed/model/migrate/versions/0001_initial_tables.py
+++ b/lib/galaxy/webapps/tool_shed/model/migrate/versions/0001_initial_tables.py
@@ -143,12 +143,13 @@
Column( "id", Integer, primary_key=True ),
Column( "tool_id", Integer, ForeignKey( "tool.id" ), index=True ),
Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True ),
- Column( "annotation", TEXT, index=True) )
+ Column( "annotation", TEXT ) )
def upgrade( migrate_engine ):
print __doc__
metadata.bind = migrate_engine
metadata.create_all()
+ Index( 'ix_tool_annotation_association_annotation', ToolAnnotationAssociation_table.c.annotation, mysql_length=767 ).create()
def downgrade( migrate_engine ):
# Operations to reverse the above upgrade go here.
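The change works around InnoDB's 767-byte index key prefix limit: MySQL refuses a plain index on an unbounded TEXT column, so the column loses `index=True` and the index is instead created explicitly with a `mysql_length` prefix. The same pattern in a minimal, self-contained sketch (plain SQLAlchemy rather than the sqlalchemy-migrate wrapper used by the migration, with an illustrative table name; `mysql_length` is ignored by other dialects):

```python
from sqlalchemy import (Column, Index, Integer, MetaData, Table, TEXT,
                        create_engine, inspect)

metadata = MetaData()
annotation_table = Table('example_annotation', metadata,
                         Column('id', Integer, primary_key=True),
                         # No index=True here: MySQL cannot index TEXT
                         # without an explicit prefix length.
                         Column('annotation', TEXT))

engine = create_engine('sqlite://')  # any dialect; mysql_length only affects MySQL DDL
metadata.create_all(engine)

# Create the index separately, capping the indexed prefix at 767 bytes on MySQL.
Index('ix_example_annotation_annotation', annotation_table.c.annotation,
      mysql_length=767).create(engine)

print([ix['name'] for ix in inspect(engine).get_indexes('example_annotation')])
```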
commit/galaxy-central: greg: Rename the ~/tool_shed/dependencies/dependency_manager.py module to be attribute_handlers.py
by commits-noreply@bitbucket.org 16 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/10c58feed93f/
Changeset: 10c58feed93f
User: greg
Date: 2014-07-16 15:16:35
Summary: Rename the ~/tool_shed/dependencies/dependency_manager.py module to be attribute_handlers.py
Affected #: 4 files
diff -r 6a548e4e607f57942bc874ab10eb738522d5e9d9 -r 10c58feed93f26d8247be96baa111b3408970784 lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -8,7 +8,7 @@
from galaxy import util
from galaxy import web
from galaxy.datatypes import checkers
-from tool_shed.dependencies import dependency_manager
+from tool_shed.dependencies import attribute_handlers
import tool_shed.repository_types.util as rt_util
from tool_shed.util import basic_util
from tool_shed.util import commit_util
@@ -95,8 +95,8 @@
uploaded_file_filename = os.path.split( file_data.filename )[ -1 ]
isempty = os.path.getsize( os.path.abspath( uploaded_file_name ) ) == 0
if uploaded_file or uploaded_directory:
- rdah = dependency_manager.RepositoryDependencyAttributeHandler( trans.app, unpopulate=False )
- tdah = dependency_manager.ToolDependencyAttributeHandler( trans.app, unpopulate=False )
+ rdah = attribute_handlers.RepositoryDependencyAttributeHandler( trans.app, unpopulate=False )
+ tdah = attribute_handlers.ToolDependencyAttributeHandler( trans.app, unpopulate=False )
ok = True
isgzip = False
isbz2 = False
diff -r 6a548e4e607f57942bc874ab10eb738522d5e9d9 -r 10c58feed93f26d8247be96baa111b3408970784 lib/tool_shed/capsule/capsule_manager.py
--- a/lib/tool_shed/capsule/capsule_manager.py
+++ b/lib/tool_shed/capsule/capsule_manager.py
@@ -13,8 +13,7 @@
from galaxy.util import CHUNK_SIZE
from galaxy.util.odict import odict
from tool_shed.dependencies.repository.relation_builder import RelationBuilder
-from tool_shed.dependencies.dependency_manager import RepositoryDependencyAttributeHandler
-from tool_shed.dependencies.dependency_manager import ToolDependencyAttributeHandler
+from tool_shed.dependencies import attribute_handlers
from tool_shed.util import basic_util
from tool_shed.util import commit_util
from tool_shed.util import common_util
@@ -130,8 +129,8 @@
return sub_elements
def generate_repository_archive( self, repository, changeset_revision, work_dir ):
- rdah = RepositoryDependencyAttributeHandler( self.app, unpopulate=True )
- tdah = ToolDependencyAttributeHandler( self.app, unpopulate=True )
+ rdah = attribute_handlers.RepositoryDependencyAttributeHandler( self.app, unpopulate=True )
+ tdah = attribute_handlers.ToolDependencyAttributeHandler( self.app, unpopulate=True )
file_type_str = basic_util.get_file_type_str( changeset_revision, self.file_type )
file_name = '%s-%s' % ( repository.name, file_type_str )
return_code, error_message = hg_util.archive_repository_revision( self.app,
@@ -646,8 +645,8 @@
def import_repository_archive( self, repository, repository_archive_dict ):
"""Import a repository archive contained within a repository capsule."""
- rdah = RepositoryDependencyAttributeHandler( self.app, unpopulate=False )
- tdah = ToolDependencyAttributeHandler( self.app, unpopulate=False )
+ rdah = attribute_handlers.RepositoryDependencyAttributeHandler( self.app, unpopulate=False )
+ tdah = attribute_handlers.ToolDependencyAttributeHandler( self.app, unpopulate=False )
archive_file_name = repository_archive_dict.get( 'archive_file_name', None )
capsule_file_name = repository_archive_dict[ 'capsule_file_name' ]
encoded_file_path = repository_archive_dict[ 'encoded_file_path' ]
diff -r 6a548e4e607f57942bc874ab10eb738522d5e9d9 -r 10c58feed93f26d8247be96baa111b3408970784 lib/tool_shed/dependencies/attribute_handlers.py
--- /dev/null
+++ b/lib/tool_shed/dependencies/attribute_handlers.py
@@ -0,0 +1,203 @@
+import copy
+import logging
+
+from galaxy.util import asbool
+from galaxy.util.odict import odict
+from galaxy.web import url_for
+
+from tool_shed.dependencies.tool import tag_attribute_handler
+from tool_shed.repository_types.util import REPOSITORY_DEPENDENCY_DEFINITION_FILENAME
+from tool_shed.repository_types.util import TOOL_DEPENDENCY_DEFINITION_FILENAME
+from tool_shed.util import hg_util
+from tool_shed.util import shed_util_common as suc
+from tool_shed.util import xml_util
+log = logging.getLogger( __name__ )
+
+
+class RepositoryDependencyAttributeHandler( object ):
+
+ def __init__( self, app, unpopulate ):
+ self.app = app
+ self.file_name = REPOSITORY_DEPENDENCY_DEFINITION_FILENAME
+ self.unpopulate = unpopulate
+
+ def check_tag_attributes( self, elem ):
+ # <repository name="molecule_datatypes" owner="test" />
+ error_message = ''
+ name = elem.get( 'name' )
+ if not name:
+ error_message += 'The tag is missing the required name attribute. '
+ owner = elem.get( 'owner' )
+ if not owner:
+ error_message += 'The tag is missing the required owner attribute. '
+ log.debug( error_message )
+ return error_message
+
+ def handle_complex_dependency_elem( self, parent_elem, elem_index, elem ):
+ """
+ Populate or unpopulate the toolshed and changeset_revision attributes of a
+ <repository> tag that defines a complex repository dependency.
+ """
+ # <repository name="package_eigen_2_0" owner="test" prior_installation_required="True" />
+ altered, new_elem, error_message = self.handle_elem( elem )
+ if error_message:
+ error_message += ' The %s file contains an invalid <repository> tag.' % TOOL_DEPENDENCY_DEFINITION_FILENAME
+ return altered, new_elem, error_message
+
+ def handle_elem( self, elem ):
+ """Populate or unpopulate the changeset_revision and toolshed attributes of repository tags."""
+ # <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
+ # <repository changeset_revision="xxx" name="package_xorg_macros_1_17_1" owner="test" toolshed="yyy">
+ # <package name="xorg_macros" version="1.17.1" />
+ # </repository>
+ error_message = ''
+ name = elem.get( 'name' )
+ owner = elem.get( 'owner' )
+ # The name and owner attributes are always required, so if either are missing, return the error message.
+ if not name or not owner:
+ error_message = self.check_tag_attributes( elem )
+ return False, elem, error_message
+ altered = False
+ toolshed = elem.get( 'toolshed' )
+ changeset_revision = elem.get( 'changeset_revision' )
+ # Over a short period of time a bug existed which caused the prior_installation_required attribute
+ # to be set to False and included in the <repository> tag when a repository was exported along with
+ # its dependencies. The following will eliminate this problematic attribute upon import.
+ prior_installation_required = elem.get( 'prior_installation_required' )
+ if prior_installation_required is not None and not asbool( prior_installation_required ):
+ del elem.attrib[ 'prior_installation_required' ]
+ sub_elems = [ child_elem for child_elem in list( elem ) ]
+ if len( sub_elems ) > 0:
+ # At this point, a <repository> tag will point only to a package.
+ # <package name="xorg_macros" version="1.17.1" />
+ # Coerce the list to an odict().
+ sub_elements = odict()
+ packages = []
+ for sub_elem in sub_elems:
+ sub_elem_type = sub_elem.tag
+ sub_elem_name = sub_elem.get( 'name' )
+ sub_elem_version = sub_elem.get( 'version' )
+ if sub_elem_type and sub_elem_name and sub_elem_version:
+ packages.append( ( sub_elem_name, sub_elem_version ) )
+ sub_elements[ 'packages' ] = packages
+ else:
+ # Set to None.
+ sub_elements = None
+ if self.unpopulate:
+ # We're exporting the repository, so eliminate all toolshed and changeset_revision attributes
+ # from the <repository> tag.
+ if toolshed or changeset_revision:
+ attributes = odict()
+ attributes[ 'name' ] = name
+ attributes[ 'owner' ] = owner
+ prior_installation_required = elem.get( 'prior_installation_required' )
+ if asbool( prior_installation_required ):
+ attributes[ 'prior_installation_required' ] = 'True'
+ new_elem = xml_util.create_element( 'repository', attributes=attributes, sub_elements=sub_elements )
+ altered = True
+ return altered, new_elem, error_message
+ # From here on we're populating the toolshed and changeset_revision attributes if necessary.
+ if not toolshed:
+ # Default the setting to the current tool shed.
+ toolshed = str( url_for( '/', qualified=True ) ).rstrip( '/' )
+ elem.attrib[ 'toolshed' ] = toolshed
+ altered = True
+ if not changeset_revision:
+ # Populate the changeset_revision attribute with the latest installable metadata revision for
+ # the defined repository. We use the latest installable revision instead of the latest metadata
+ # revision to ensure that the contents of the revision are valid.
+ repository = suc.get_repository_by_name_and_owner( self.app, name, owner )
+ if repository:
+ repo = hg_util.get_repo_for_repository( self.app,
+ repository=repository,
+ repo_path=None,
+ create=False )
+ latest_installable_changeset_revision = \
+ suc.get_latest_downloadable_changeset_revision( self.app, repository, repo )
+ if latest_installable_changeset_revision != hg_util.INITIAL_CHANGELOG_HASH:
+ elem.attrib[ 'changeset_revision' ] = latest_installable_changeset_revision
+ altered = True
+ else:
+ error_message = 'Invalid latest installable changeset_revision %s ' % \
+ str( latest_installable_changeset_revision )
+ error_message += 'retrieved for repository %s owned by %s. ' % ( str( name ), str( owner ) )
+ else:
+ error_message = 'Unable to locate repository with name %s and owner %s. ' % ( str( name ), str( owner ) )
+ return altered, elem, error_message
+
+ def handle_sub_elem( self, parent_elem, elem_index, elem ):
+ """
+ Populate or unpopulate the toolshed and changeset_revision attributes for each of
+ the following tag sets.
+ <action type="set_environment_for_install">
+ <action type="setup_r_environment">
+ <action type="setup_ruby_environment">
+ """
+ sub_elem_altered = False
+ error_message = ''
+ for sub_index, sub_elem in enumerate( elem ):
+ # Make sure to skip comments and tags that are not <repository>.
+ if sub_elem.tag == 'repository':
+ altered, new_sub_elem, message = self.handle_elem( sub_elem )
+ if message:
+ error_message += 'The %s file contains an invalid <repository> tag. %s' % \
+ ( TOOL_DEPENDENCY_DEFINITION_FILENAME, message )
+ if altered:
+ if not sub_elem_altered:
+ sub_elem_altered = True
+ elem[ sub_index ] = new_sub_elem
+ if sub_elem_altered:
+ parent_elem[ elem_index ] = elem
+ return sub_elem_altered, parent_elem, error_message
+
+ def handle_tag_attributes( self, config ):
+ """
+ Populate or unpopulate the toolshed and changeset_revision attributes of a
+ <repository> tag. Populating will occur when a dependency definition file
+ is being uploaded to the repository, while unpopulating will occur when the
+ repository is being exported.
+ """
+ # Make sure we're looking at a valid repository_dependencies.xml file.
+ tree, error_message = xml_util.parse_xml( config )
+ if tree is None:
+ return False, None, error_message
+ root = tree.getroot()
+ root_altered = False
+ new_root = copy.deepcopy( root )
+ for index, elem in enumerate( root ):
+ if elem.tag == 'repository':
+ # <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
+ altered, new_elem, error_message = self.handle_elem( elem )
+ if error_message:
+ error_message = 'The %s file contains an invalid <repository> tag. %s' % ( self.file_name, error_message )
+ return False, None, error_message
+ if altered:
+ if not root_altered:
+ root_altered = True
+ new_root[ index ] = new_elem
+ return root_altered, new_root, error_message
+
+
+class ToolDependencyAttributeHandler( object ):
+
+ def __init__( self, app, unpopulate ):
+ self.app = app
+ self.file_name = TOOL_DEPENDENCY_DEFINITION_FILENAME
+ self.unpopulate = unpopulate
+
+ def handle_tag_attributes( self, tool_dependencies_config ):
+ """
+ Populate or unpopulate the toolshed and changeset_revision attributes of each <repository>
+ tag defined within a tool_dependencies.xml file.
+ """
+ rdah = RepositoryDependencyAttributeHandler( self.app, self.unpopulate )
+ tah = tag_attribute_handler.TagAttributeHandler( self.app, rdah, self.unpopulate )
+ altered = False
+ error_message = ''
+ # Make sure we're looking at a valid tool_dependencies.xml file.
+ tree, error_message = xml_util.parse_xml( tool_dependencies_config )
+ if tree is None:
+ return False, None, error_message
+ root = tree.getroot()
+ altered, new_root, error_message = tah.process_config( root )
+ return altered, new_root, error_message
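The populate/unpopulate round trip that handle_elem performs on <repository> tags can be shown with plain ElementTree (a simplified stand-in: the real handler also resolves the latest installable changeset revision from the tool shed's database and handles nested <package> tags):

```python
import xml.etree.ElementTree as ET

def populate(elem, toolshed, changeset_revision):
    """Add toolshed/changeset_revision attributes when missing (upload path)."""
    altered = False
    if not elem.get('toolshed'):
        elem.set('toolshed', toolshed)
        altered = True
    if not elem.get('changeset_revision'):
        elem.set('changeset_revision', changeset_revision)
        altered = True
    return altered

def unpopulate(elem):
    """Strip toolshed/changeset_revision attributes (export path)."""
    altered = False
    for attr in ('toolshed', 'changeset_revision'):
        if elem.attrib.pop(attr, None) is not None:
            altered = True
    return altered

elem = ET.fromstring('<repository name="molecule_datatypes" owner="test" />')
assert populate(elem, 'https://toolshed.example.org', '1a070566e9c6')
assert not populate(elem, 'x', 'y')   # already populated; second call changes nothing
assert unpopulate(elem)
print(ET.tostring(elem))
```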
diff -r 6a548e4e607f57942bc874ab10eb738522d5e9d9 -r 10c58feed93f26d8247be96baa111b3408970784 lib/tool_shed/dependencies/dependency_manager.py
--- a/lib/tool_shed/dependencies/dependency_manager.py
+++ /dev/null
@@ -1,203 +0,0 @@
-import copy
-import logging
-
-from galaxy.util import asbool
-from galaxy.util.odict import odict
-from galaxy.web import url_for
-
-from tool_shed.dependencies.tool import tag_attribute_handler
-from tool_shed.repository_types.util import REPOSITORY_DEPENDENCY_DEFINITION_FILENAME
-from tool_shed.repository_types.util import TOOL_DEPENDENCY_DEFINITION_FILENAME
-from tool_shed.util import hg_util
-from tool_shed.util import shed_util_common as suc
-from tool_shed.util import xml_util
-log = logging.getLogger( __name__ )
-
-
-class RepositoryDependencyAttributeHandler( object ):
-
- def __init__( self, app, unpopulate ):
- self.app = app
- self.file_name = REPOSITORY_DEPENDENCY_DEFINITION_FILENAME
- self.unpopulate = unpopulate
-
- def check_tag_attributes( self, elem ):
- # <repository name="molecule_datatypes" owner="test" />
- error_message = ''
- name = elem.get( 'name' )
- if not name:
- error_message += 'The tag is missing the required name attribute. '
- owner = elem.get( 'owner' )
- if not owner:
- error_message += 'The tag is missing the required owner attribute. '
- log.debug( error_message )
- return error_message
-
- def handle_complex_dependency_elem( self, parent_elem, elem_index, elem ):
- """
- Populate or unpopulate the toolshed and changeset_revision attributes of a
- <repository> tag that defines a complex repository dependency.
- """
- # <repository name="package_eigen_2_0" owner="test" prior_installation_required="True" />
-        altered, new_elem, error_message = self.handle_elem( elem )
-        if error_message:
-            error_message += ' The %s file contains an invalid <repository> tag.' % TOOL_DEPENDENCY_DEFINITION_FILENAME
-        return altered, new_elem, error_message
-
-    def handle_elem( self, elem ):
-        """Populate or unpopulate the changeset_revision and toolshed attributes of repository tags."""
-        # <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
-        # <repository changeset_revision="xxx" name="package_xorg_macros_1_17_1" owner="test" toolshed="yyy">
-        #    <package name="xorg_macros" version="1.17.1" />
-        # </repository>
-        error_message = ''
-        name = elem.get( 'name' )
-        owner = elem.get( 'owner' )
-        # The name and owner attributes are always required, so if either is missing, return the error message.
-        if not name or not owner:
-            error_message = self.check_tag_attributes( elem )
-            return False, elem, error_message
-        altered = False
-        toolshed = elem.get( 'toolshed' )
-        changeset_revision = elem.get( 'changeset_revision' )
-        # Over a short period of time a bug existed that caused the prior_installation_required attribute
-        # to be set to False and included in the <repository> tag when a repository was exported along with
-        # its dependencies.  The following will eliminate this problematic attribute upon import.
-        prior_installation_required = elem.get( 'prior_installation_required' )
-        if prior_installation_required is not None and not asbool( prior_installation_required ):
-            del elem.attrib[ 'prior_installation_required' ]
-        sub_elems = [ child_elem for child_elem in list( elem ) ]
-        if len( sub_elems ) > 0:
-            # At this point, a <repository> tag will point only to a package.
-            # <package name="xorg_macros" version="1.17.1" />
-            # Coerce the list to an odict().
-            sub_elements = odict()
-            packages = []
-            for sub_elem in sub_elems:
-                sub_elem_type = sub_elem.tag
-                sub_elem_name = sub_elem.get( 'name' )
-                sub_elem_version = sub_elem.get( 'version' )
-                if sub_elem_type and sub_elem_name and sub_elem_version:
-                    packages.append( ( sub_elem_name, sub_elem_version ) )
-            sub_elements[ 'packages' ] = packages
-        else:
-            # Set to None.
-            sub_elements = None
-        if self.unpopulate:
-            # We're exporting the repository, so eliminate all toolshed and changeset_revision attributes
-            # from the <repository> tag.
-            if toolshed or changeset_revision:
-                attributes = odict()
-                attributes[ 'name' ] = name
-                attributes[ 'owner' ] = owner
-                prior_installation_required = elem.get( 'prior_installation_required' )
-                if asbool( prior_installation_required ):
-                    attributes[ 'prior_installation_required' ] = 'True'
-                new_elem = xml_util.create_element( 'repository', attributes=attributes, sub_elements=sub_elements )
-                altered = True
-            return altered, new_elem, error_message
-        # From here on we're populating the toolshed and changeset_revision attributes if necessary.
-        if not toolshed:
-            # Default the setting to the current tool shed.
-            toolshed = str( url_for( '/', qualified=True ) ).rstrip( '/' )
-            elem.attrib[ 'toolshed' ] = toolshed
-            altered = True
-        if not changeset_revision:
-            # Populate the changeset_revision attribute with the latest installable metadata revision for
-            # the defined repository.  We use the latest installable revision instead of the latest metadata
-            # revision to ensure that the contents of the revision are valid.
-            repository = suc.get_repository_by_name_and_owner( self.app, name, owner )
-            if repository:
-                repo = hg_util.get_repo_for_repository( self.app,
-                                                        repository=repository,
-                                                        repo_path=None,
-                                                        create=False )
-                latest_installable_changeset_revision = \
-                    suc.get_latest_downloadable_changeset_revision( self.app, repository, repo )
-                if latest_installable_changeset_revision != hg_util.INITIAL_CHANGELOG_HASH:
-                    elem.attrib[ 'changeset_revision' ] = latest_installable_changeset_revision
-                    altered = True
-                else:
-                    error_message = 'Invalid latest installable changeset_revision %s ' % \
-                        str( latest_installable_changeset_revision )
-                    error_message += 'retrieved for repository %s owned by %s. ' % ( str( name ), str( owner ) )
-            else:
-                error_message = 'Unable to locate repository with name %s and owner %s. ' % ( str( name ), str( owner ) )
-        return altered, elem, error_message
-
-    def handle_sub_elem( self, parent_elem, elem_index, elem ):
-        """
-        Populate or unpopulate the toolshed and changeset_revision attributes for each of
-        the following tag sets.
-        <action type="set_environment_for_install">
-        <action type="setup_r_environment">
-        <action type="setup_ruby_environment">
-        """
-        sub_elem_altered = False
-        error_message = ''
-        for sub_index, sub_elem in enumerate( elem ):
-            # Make sure to skip comments and tags that are not <repository>.
-            if sub_elem.tag == 'repository':
-                altered, new_sub_elem, message = self.handle_elem( sub_elem )
-                if message:
-                    error_message += 'The %s file contains an invalid <repository> tag. %s' % \
-                        ( TOOL_DEPENDENCY_DEFINITION_FILENAME, message )
-                if altered:
-                    if not sub_elem_altered:
-                        sub_elem_altered = True
-                    elem[ sub_index ] = new_sub_elem
-        if sub_elem_altered:
-            parent_elem[ elem_index ] = elem
-        return sub_elem_altered, parent_elem, error_message
-
-    def handle_tag_attributes( self, config ):
-        """
-        Populate or unpopulate the toolshed and changeset_revision attributes of a
-        <repository> tag.  Populating will occur when a dependency definition file
-        is being uploaded to the repository, while unpopulating will occur when the
-        repository is being exported.
-        """
-        # Make sure we're looking at a valid repository_dependencies.xml file.
-        tree, error_message = xml_util.parse_xml( config )
-        if tree is None:
-            return False, None, error_message
-        root = tree.getroot()
-        root_altered = False
-        new_root = copy.deepcopy( root )
-        for index, elem in enumerate( root ):
-            if elem.tag == 'repository':
-                # <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
-                altered, new_elem, error_message = self.handle_elem( elem )
-                if error_message:
-                    error_message = 'The %s file contains an invalid <repository> tag. %s' % ( self.file_name, error_message )
-                    return False, None, error_message
-                if altered:
-                    if not root_altered:
-                        root_altered = True
-                    new_root[ index ] = new_elem
-        return root_altered, new_root, error_message
-
-
-class ToolDependencyAttributeHandler( object ):
-
-    def __init__( self, app, unpopulate ):
-        self.app = app
-        self.file_name = TOOL_DEPENDENCY_DEFINITION_FILENAME
-        self.unpopulate = unpopulate
-
-    def handle_tag_attributes( self, tool_dependencies_config ):
-        """
-        Populate or unpopulate the toolshed and changeset_revision attributes of each <repository>
-        tag defined within a tool_dependencies.xml file.
-        """
-        rdah = RepositoryDependencyAttributeHandler( self.app, self.unpopulate )
-        tah = tag_attribute_handler.TagAttributeHandler( self.app, rdah, self.unpopulate )
-        altered = False
-        error_message = ''
-        # Make sure we're looking at a valid tool_dependencies.xml file.
-        tree, error_message = xml_util.parse_xml( tool_dependencies_config )
-        if tree is None:
-            return False, None, error_message
-        root = tree.getroot()
-        altered, new_root, error_message = tah.process_config( root )
-        return altered, new_root, error_message
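The populate / unpopulate round trip that handle_elem() implements can be sketched in a few lines with only the standard library. This is a simplified illustration, not the tool shed's code: DEFAULT_TOOLSHED and the hard-coded changeset revision are hypothetical stand-ins for url_for( '/', qualified=True ) and get_latest_downloadable_changeset_revision(), which require a live tool shed.

```python
# Minimal sketch of the populate / unpopulate behavior, assuming a fixed
# tool shed URL and a known changeset revision (both hypothetical).
import xml.etree.ElementTree as ET

DEFAULT_TOOLSHED = 'https://toolshed.example.org'  # hypothetical stand-in

def populate( elem, latest_installable_changeset_revision ):
    """Fill in missing toolshed / changeset_revision attributes; return True if altered."""
    altered = False
    if not elem.get( 'toolshed' ):
        elem.set( 'toolshed', DEFAULT_TOOLSHED )
        altered = True
    if not elem.get( 'changeset_revision' ):
        elem.set( 'changeset_revision', latest_installable_changeset_revision )
        altered = True
    return altered

def unpopulate( elem ):
    """Strip toolshed / changeset_revision attributes for export; return True if altered."""
    altered = False
    for attr in ( 'toolshed', 'changeset_revision' ):
        if attr in elem.attrib:
            del elem.attrib[ attr ]
            altered = True
    return altered

elem = ET.fromstring( '<repository name="molecule_datatypes" owner="test" />' )
populate( elem, '1a070566e9c6' )
assert elem.get( 'changeset_revision' ) == '1a070566e9c6'
unpopulate( elem )
assert 'toolshed' not in elem.attrib
```

Uploading a dependency definition corresponds to populate (pin the tag to a shed and revision); exporting corresponds to unpopulate (strip them so the importing shed can fill in its own).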
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: jgoecks: Visualization framework: fix bug in searching interval features.
by commits-noreply@bitbucket.org 15 Jul '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/6a548e4e607f/
Changeset: 6a548e4e607f
User: jgoecks
Date: 2014-07-15 23:04:33
Summary: Visualization framework: fix bug in searching interval features.
Affected #: 1 file
diff -r 70272fb2005161f34e13173df323b8e5363a9319 -r 6a548e4e607f57942bc874ab10eb738522d5e9d9 lib/galaxy/visualization/data_providers/genome.py
--- a/lib/galaxy/visualization/data_providers/genome.py
+++ b/lib/galaxy/visualization/data_providers/genome.py
@@ -93,7 +93,10 @@
             else:
                 high = mid
+        # Need to move back one line because last line read may be included in
+        # results.
         position = low * line_len
+        textloc_file.seek( position )
         # At right point in file, generate hits.
         result = []
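The one-line fix above addresses a subtle property of binary searching a file of fixed-width lines: when the loop exits, the file offset is wherever the last read inside the loop left it, not necessarily at the first matching line, so the scan that follows can silently skip a hit. A toy reproduction under assumed conditions (sorted records padded to a hypothetical fixed width; the real provider operates on Galaxy's feature-location index files):

```python
# Toy version of the fixed-width-line binary search patched above.  The
# seek() after the loop is the fix: without it, scanning resumes at the
# offset left by the last read inside the loop.
import io

LINE_LEN = 8  # assumed record width, including the trailing newline

def search( textloc_file, query, num_lines ):
    """Return all records whose text starts with `query`."""
    low, high = 0, num_lines
    while low < high:
        mid = ( low + high ) // 2
        textloc_file.seek( mid * LINE_LEN )
        line = textloc_file.read( LINE_LEN )
        if line.rstrip() < query:
            low = mid + 1
        else:
            high = mid
    # Need to move back because the last line read may be included in results.
    position = low * LINE_LEN
    textloc_file.seek( position )
    result = []
    for line in textloc_file:
        if not line.startswith( query ):
            break
        result.append( line.rstrip() )
    return result

records = [ 'apple', 'apple2', 'banana', 'cherry' ]
f = io.StringIO( ''.join( '%-7s\n' % r for r in records ) )
assert search( f, 'apple', len( records ) ) == [ 'apple', 'apple2' ]
```

In this sketch, deleting the final seek makes the search return only [ 'apple2' ] and miss the first hit, which is the kind of dropped result the changeset fixes.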