galaxy-commits
August 2012
- 1 participant
- 118 discussions
commit/galaxy-central: greg: Fix for resetting all metadata on a tool shed repository.
by Bitbucket 31 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/ed1bc0c4770b/
changeset: ed1bc0c4770b
user: greg
date: 2012-08-31 22:04:46
summary: Fix for resetting all metadata on a tool shed repository.
affected #: 1 file
diff -r fab5274bf15a15ff7c710b60f8b521478c20f907 -r ed1bc0c4770b8059489f2fee15716dfcdc07feaa lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -868,6 +868,7 @@
metadata_dict = None
ancestor_changeset_revision = None
ancestor_metadata_dict = None
+ invalid_file_tups = []
home_dir = os.getcwd()
for changeset in repo.changelog:
work_dir = tempfile.mkdtemp()
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
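The one-line fix above binds `invalid_file_tups` before the changelog loop, so the accumulator exists even when nothing ever appends to it. A minimal standalone sketch of the pattern (the function and data shapes here are illustrative, not Galaxy's actual API):

```python
def reset_all_metadata(changesets):
    """Collect (name, error) tuples across every changeset.

    Binding the accumulator before the loop keeps it defined even
    for an empty changelog -- the failure mode the commit fixes.
    """
    invalid_file_tups = []  # initialized up front, as in the diff
    for changeset in changesets:
        # Each changeset is modeled here as (name, ok, error) records.
        for name, ok, error in changeset:
            if not ok:
                invalid_file_tups.append((name, error))
    return invalid_file_tups
```

With no changesets at all, the list is still defined and simply comes back empty, rather than a later reference raising `NameError`.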
31 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/fab5274bf15a/
changeset: fab5274bf15a
user: greg
date: 2012-08-31 21:25:48
summary: Display error message when unable to set metadata on a tool shed repository because of invalid tool configs. Also correct the way metadata is set on a tool shed repository when all repository files are deleted in a changeset.
affected #: 2 files
diff -r 0e3ecd0ea81876e4d32d108be1e02e393846e4e8 -r fab5274bf15a15ff7c710b60f8b521478c20f907 lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -609,7 +609,10 @@
is_tool = False
if is_tool:
tool, valid, error_message = load_tool_from_config( app, full_path )
- if tool is not None:
+ if tool is None:
+ if not valid:
+ invalid_file_tups.append( ( name, error_message ) )
+ else:
invalid_files_and_errors_tups = check_tool_input_params( app, files_dir, name, tool, sample_file_metadata_paths, webapp=webapp )
can_set_metadata = True
for tup in invalid_files_and_errors_tups:
diff -r 0e3ecd0ea81876e4d32d108be1e02e393846e4e8 -r fab5274bf15a15ff7c710b60f8b521478c20f907 lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -921,9 +921,6 @@
ancestor_metadata_dict = None
elif ancestor_metadata_dict:
# We reach here only if current_metadata_dict is empty and ancestor_metadata_dict is not.
- ancestor_changeset_revision = current_changeset_revision
- metadata_changeset_revision = current_changeset_revision
- metadata_dict = ancestor_metadata_dict
if not ctx.children():
# We're at the end of the change log.
repository_metadata = create_or_update_repository_metadata( trans, id, repository, metadata_changeset_revision, metadata_dict )
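The `shed_util` hunk above inverts the `tool is not None` test so that a config the loader rejects is recorded rather than silently dropped. A hypothetical sketch of that control flow (`loader` stands in for Galaxy's `load_tool_from_config`, which returns a `(tool, valid, error_message)` triple):

```python
def collect_tool_errors(config_names, loader):
    """Split configs into loadable tools and (name, error) tuples."""
    valid_tools = []
    invalid_file_tups = []
    for name in config_names:
        tool, valid, error_message = loader(name)
        if tool is None:
            # Previously this branch fell through with no message;
            # after the commit, the error surfaces to the uploader.
            if not valid:
                invalid_file_tups.append((name, error_message))
        else:
            valid_tools.append(tool)
    return valid_tools, invalid_file_tups
```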
commit/galaxy-central: jgoecks: Set attribute metadata for GFF and GFF3 in addition to GTF.
by Bitbucket 31 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/0e3ecd0ea818/
changeset: 0e3ecd0ea818
user: jgoecks
date: 2012-08-31 21:24:25
summary: Set attribute metadata for GFF and GFF3 in addition to GTF.
affected #: 1 file
diff -r 524e481298fa0977990d83bce92880caf05ceb06 -r 0e3ecd0ea81876e4d32d108be1e02e393846e4e8 lib/galaxy/datatypes/interval.py
--- a/lib/galaxy/datatypes/interval.py
+++ b/lib/galaxy/datatypes/interval.py
@@ -576,13 +576,58 @@
"""Add metadata elements"""
MetadataElement( name="columns", default=9, desc="Number of columns", readonly=True, visible=False )
MetadataElement( name="column_types", default=['str','str','str','int','int','int','str','str','str'], param=metadata.ColumnTypesParameter, desc="Column types", readonly=True, visible=False )
+
+ MetadataElement( name="attributes", default=0, desc="Number of attributes", readonly=True, visible=False, no_value=0 )
+ MetadataElement( name="attribute_types", default={}, desc="Attribute types", param=metadata.DictParameter, readonly=True, visible=False, no_value=[] )
def __init__( self, **kwd ):
"""Initialize datatype, by adding GBrowse display app"""
Tabular.__init__(self, **kwd)
self.add_display_app( 'ucsc', 'display at UCSC', 'as_ucsc_display_file', 'ucsc_links' )
self.add_display_app( 'gbrowse', 'display in Gbrowse', 'as_gbrowse_display_file', 'gbrowse_links' )
+
+ def set_attribute_metadata( self, dataset ):
+ """
+ Sets metadata elements for dataset's attributes.
+ """
+
+ # Use first N lines to set metadata for dataset attributes. Attributes
+ # not found in the first N lines will not have metadata.
+ num_lines = 200
+ attribute_types = {}
+ for i, line in enumerate( file ( dataset.file_name ) ):
+ if line and not line.startswith( '#' ):
+ elems = line.split( '\t' )
+ if len( elems ) == 9:
+ try:
+ # Loop through attributes to set types.
+ for name, value in parse_gff_attributes( elems[8] ).items():
+ # Default type is string.
+ value_type = "str"
+ try:
+ # Try int.
+ int( value )
+ value_type = "int"
+ except:
+ try:
+ # Try float.
+ float( value )
+ value_type = "float"
+ except:
+ pass
+ attribute_types[ name ] = value_type
+ except:
+ pass
+ if i + 1 == num_lines:
+ break
+
+ # Set attribute metadata and then set additional metadata.
+ dataset.metadata.attribute_types = attribute_types
+ dataset.metadata.attributes = len( attribute_types )
+
def set_meta( self, dataset, overwrite = True, **kwd ):
+ self.set_attribute_metadata( dataset )
+
i = 0
for i, line in enumerate( file ( dataset.file_name ) ):
line = line.rstrip('\r\n')
@@ -596,6 +641,7 @@
except:
pass
Tabular.set_meta( self, dataset, overwrite = overwrite, skip = i )
+
def display_peek( self, dataset ):
"""Returns formated html of peek"""
return Tabular.make_html_table( self, dataset, column_names=self.column_names )
@@ -756,6 +802,8 @@
"""Initialize datatype, by adding GBrowse display app"""
Gff.__init__(self, **kwd)
def set_meta( self, dataset, overwrite = True, **kwd ):
+ self.set_attribute_metadata( dataset )
+
i = 0
for i, line in enumerate( file ( dataset.file_name ) ):
line = line.rstrip('\r\n')
@@ -855,9 +903,6 @@
MetadataElement( name="columns", default=9, desc="Number of columns", readonly=True, visible=False )
MetadataElement( name="column_types", default=['str','str','str','int','int','float','str','int','list'], param=metadata.ColumnTypesParameter, desc="Column types", readonly=True, visible=False )
- MetadataElement( name="attributes", default=0, desc="Number of attributes", readonly=True, visible=False, no_value=0 )
- MetadataElement( name="attribute_types", default={}, desc="Attribute types", param=metadata.DictParameter, readonly=True, visible=False, no_value=[] )
-
def sniff( self, filename ):
"""
Determines whether the file is in gtf format
@@ -917,42 +962,6 @@
return True
except:
return False
-
- def set_meta( self, dataset, overwrite = True, **kwd ):
- # Use first N lines to set metadata for dataset attributes. Attributes
- # not found in the first N lines will not have metadata.
- num_lines = 200
- attribute_types = {}
- for i, line in enumerate( file ( dataset.file_name ) ):
- if line and not line.startswith( '#' ):
- elems = line.split( '\t' )
- if len( elems ) == 9:
- try:
- # Loop through attributes to set types.
- for name, value in parse_gff_attributes( elems[8] ).items():
- # Default type is string.
- value_type = "str"
- try:
- # Try int.
- int( value )
- value_type = "int"
- except:
- try:
- # Try float.
- float( value )
- value_type = "float"
- except:
- pass
- attribute_types[ name ] = value_type
- except:
- pass
- if i + 1 == num_lines:
- break
-
- # Set attribute metadata and then set additional metadata.
- dataset.metadata.attribute_types = attribute_types
- dataset.metadata.attributes = len( attribute_types )
- Gff.set_meta( self, dataset, overwrite = overwrite, skip = i )
def get_track_type( self ):
return "FeatureTrack", {"data": "interval_index", "index": "summary_tree"}
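The type-inference loop moved into `set_attribute_metadata` tries `int`, then `float`, and falls back to `str` for each attribute value in the first 200 data lines. A standalone sketch of the same idea (the naive `key value;` parsing below is only a stand-in for Galaxy's `parse_gff_attributes`, which handles the real attribute grammar and quoting):

```python
def infer_type(value):
    """Return 'int', 'float', or 'str' for one attribute value."""
    try:
        int(value)
        return "int"
    except ValueError:
        try:
            float(value)
            return "float"
        except ValueError:
            return "str"

def infer_attribute_types(lines, num_lines=200):
    """Scan up to num_lines 9-column rows; map attribute name -> type.

    Attributes not seen in the first num_lines rows get no entry,
    matching the comment in the diff.
    """
    attribute_types = {}
    for i, line in enumerate(lines):
        if line and not line.startswith("#"):
            elems = line.rstrip("\n").split("\t")
            if len(elems) == 9:
                for pair in elems[8].split(";"):
                    parts = pair.strip().split(" ", 1)
                    if len(parts) == 2:
                        name, value = parts
                        attribute_types[name] = infer_type(value.strip('"'))
        if i + 1 == num_lines:
            break
    return attribute_types
```

Hoisting this into a shared `set_attribute_metadata` method is what lets `Gff` and `Gff3` pick up the attribute metadata that was previously computed only for `Gtf`.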
31 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/524e481298fa/
changeset: 524e481298fa
user: greg
date: 2012-08-31 20:03:40
summary: Add new repository tip column to tool shed repository grids, and differentiate between the repository tip revision and the repository's metadata revisions and installable revisions in various grids.
affected #: 2 files
diff -r b822f138cd9785243d20bf32c9e02d4dfda22d80 -r 524e481298fa0977990d83bce92880caf05ceb06 lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -514,6 +514,12 @@
else:
rev = '-1'
changeset_tups.append( ( rev, changeset_revision ) )
+ if len( changeset_tups ) == 1:
+ changeset_tup = changeset_tups[ 0 ]
+ current_changeset_revision = changeset_tup[ 1 ]
+ if current_changeset_revision == before_changeset_revision:
+ return INITIAL_CHANGELOG_HASH
+ return current_changeset_revision
previous_changeset_revision = None
current_changeset_revision = None
for changeset_tup in sorted( changeset_tups ):
diff -r b822f138cd9785243d20bf32c9e02d4dfda22d80 -r 524e481298fa0977990d83bce92880caf05ceb06 lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -9,9 +9,9 @@
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy.util.json import from_json_string, to_json_string
from galaxy.model.orm import *
-from galaxy.util.shed_util import create_repo_info_dict, get_changectx_for_changeset, get_configured_ui, get_repository_file_contents, load_tool_from_config
-from galaxy.util.shed_util import NOT_TOOL_CONFIGS, open_repository_files_folder, reversed_lower_upper_bounded_changelog, reversed_upper_bounded_changelog
-from galaxy.util.shed_util import strip_path, to_html_escaped, update_repository, url_join
+from galaxy.util.shed_util import create_repo_info_dict, get_changectx_for_changeset, get_configured_ui, get_repository_file_contents, INITIAL_CHANGELOG_HASH
+from galaxy.util.shed_util import load_tool_from_config, NOT_TOOL_CONFIGS, open_repository_files_folder, reversed_lower_upper_bounded_changelog
+from galaxy.util.shed_util import reversed_upper_bounded_changelog, strip_path, to_html_escaped, update_repository, url_join
from galaxy.tool_shed.encoding_util import *
from common import *
@@ -109,14 +109,23 @@
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository ):
return repository.name
- class RevisionColumn( grids.GridColumn ):
+ class MetadataRevisionColumn( grids.GridColumn ):
def __init__( self, col_name ):
grids.GridColumn.__init__( self, col_name )
def get_value( self, trans, grid, repository ):
- """Display a SelectField whose options are the changeset_revision strings of all downloadable_revisions of this repository."""
+ """Display a SelectField whose options are the changeset_revision strings of all revisions of this repository."""
+ # A repository's metadata revisions may not all be installable, as some may contain only invalid tools.
select_field = build_changeset_revision_select_field( trans, repository, downloadable_only=False )
if len( select_field.options ) > 1:
return select_field.get_html()
+ elif len( select_field.options ) == 1:
+ return select_field.options[ 0 ][ 0 ]
+ return ''
+ class TipRevisionColumn( grids.GridColumn ):
+ def __init__( self, col_name ):
+ grids.GridColumn.__init__( self, col_name )
+ def get_value( self, trans, grid, repository ):
+ """Display the repository tip revision label."""
return repository.revision
class DescriptionColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository ):
@@ -169,7 +178,8 @@
DescriptionColumn( "Synopsis",
key="description",
attach_popup=False ),
- RevisionColumn( "Revision" ),
+ MetadataRevisionColumn( "Metadata Revisions" ),
+ TipRevisionColumn( "Tip Revision" ),
CategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
@@ -294,7 +304,9 @@
select_field = build_changeset_revision_select_field( trans, repository, downloadable_only=True )
if len( select_field.options ) > 1:
return select_field.get_html()
- return repository.revision
+ elif len( select_field.options ) == 1:
+ return select_field.options[ 0 ][ 0 ]
+ return ''
title = "Valid repositories"
columns = [
RepositoryListGrid.NameColumn( "Name",
@@ -303,7 +315,7 @@
RepositoryListGrid.DescriptionColumn( "Synopsis",
key="description",
attach_popup=False ),
- RevisionColumn( "Revision" ),
+ RevisionColumn( "Installable Revisions" ),
RepositoryListGrid.UserColumn( "Owner",
model_class=model.User,
attach_popup=False ),
@@ -657,11 +669,13 @@
if operation == "preview_tools_in_changeset":
repository_id = kwd.get( 'id', None )
repository = get_repository( trans, repository_id )
+ repository_metadata = get_latest_repository_metadata( trans, repository.id )
+ latest_installable_changeset_revision = repository_metadata.changeset_revision
return trans.response.send_redirect( web.url_for( controller='repository',
action='preview_tools_in_changeset',
webapp=webapp,
repository_id=repository_id,
- changeset_revision=repository.tip ) )
+ changeset_revision=latest_installable_changeset_revision ) )
elif operation == "valid_repositories_by_category":
# Eliminate the current filters if any exist.
for k, v in kwd.items():
@@ -1601,16 +1615,27 @@
selected_value=changeset_revision,
add_id_to_name=False,
downloadable_only=False )
- revision_label = get_revision_label( trans, repository, changeset_revision )
- repository_metadata = get_repository_metadata_by_changeset_revision( trans, id, changeset_revision )
- if repository_metadata:
- repository_metadata_id = trans.security.encode_id( repository_metadata.id )
- metadata = repository_metadata.metadata
- is_malicious = repository_metadata.malicious
- else:
- repository_metadata_id = None
- metadata = None
- is_malicious = False
+ revision_label = get_revision_label( trans, repository, repository.tip )
+ repository_metadata_id = None
+ metadata = None
+ is_malicious = False
+ if changeset_revision != INITIAL_CHANGELOG_HASH:
+ repository_metadata = get_repository_metadata_by_changeset_revision( trans, id, changeset_revision )
+ if repository_metadata:
+ revision_label = get_revision_label( trans, repository, changeset_revision )
+ repository_metadata_id = trans.security.encode_id( repository_metadata.id )
+ metadata = repository_metadata.metadata
+ is_malicious = repository_metadata.malicious
+ else:
+ # There is no repository_metadata defined for the changeset_revision, so see if it was defined in a previous changeset in the changelog.
+ previous_changeset_revision = get_previous_downloadable_changset_revision( repository, repo, changeset_revision )
+ if previous_changeset_revision != INITIAL_CHANGELOG_HASH:
+ repository_metadata = get_repository_metadata_by_changeset_revision( trans, id, previous_changeset_revision )
+ if repository_metadata:
+ revision_label = get_revision_label( trans, repository, previous_changeset_revision )
+ repository_metadata_id = trans.security.encode_id( repository_metadata.id )
+ metadata = repository_metadata.metadata
+ is_malicious = repository_metadata.malicious
if is_malicious:
if trans.app.security_agent.can_push( trans.user, repository ):
message += malicious_error_can_push
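The metadata lookup in the last hunk falls back through the changelog: if the requested revision has no `repository_metadata` record, the nearest earlier revision that does is used instead, bottoming out at the initial hash. A simplified, list-based sketch of that resolution (names are illustrative; Galaxy walks a Mercurial changelog, and the sentinel below is only a placeholder for its `INITIAL_CHANGELOG_HASH`):

```python
INITIAL_CHANGELOG_HASH = "<initial>"  # placeholder sentinel

def resolve_metadata_revision(changelog, metadata_revisions, requested):
    """Return the revision whose metadata should be displayed.

    changelog is ordered oldest -> newest; metadata_revisions is the
    set of revisions that have a repository_metadata record.
    """
    if requested in metadata_revisions:
        return requested
    # Walk backwards from the requested revision toward the root.
    idx = changelog.index(requested)
    for rev in reversed(changelog[:idx]):
        if rev in metadata_revisions:
            return rev
    return INITIAL_CHANGELOG_HASH
```

This is what lets the grids distinguish a repository's tip revision from its metadata and installable revisions: the tip may have no metadata of its own, yet still resolve to a displayable earlier revision.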
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/9beb21b505e9/
changeset: 9beb21b505e9
user: carlfeberhard
date: 2012-08-31 17:53:46
summary: (WIP) history panel, backbone.js conversion (includes alternate_history.mako as a temporary transition template)
affected #: 3 files
diff -r 01ed2f462dd7709876458b031d786d277d1f72f3 -r 9beb21b505e9485e27902d75bf342beea80c6b7e lib/galaxy/web/controllers/root.py
--- a/lib/galaxy/web/controllers/root.py
+++ b/lib/galaxy/web/controllers/root.py
@@ -119,7 +119,11 @@
show_deleted = show_purged = util.string_as_bool( show_deleted )
show_hidden = util.string_as_bool( show_hidden )
datasets = self.get_history_datasets( trans, history, show_deleted, show_hidden, show_purged )
- return trans.stream_template_mako( "root/history.mako",
+
+ # history panel -> backbone
+ history_panel_template = "root/history.mako"
+ #history_panel_template = "root/alternate_history.mako"
+ return trans.stream_template_mako( history_panel_template,
history = history,
annotation = self.get_item_annotation_str( trans.sa_session, trans.user, history ),
datasets = datasets,
@@ -508,3 +512,85 @@
@web.expose
def generate_error( self, trans ):
raise Exception( "Fake error!" )
+
+
+
+"""
+import galaxy.web.framework.helpers as web_helpers
+from functools import wraps
+def debug_mako_template( template_fn ):
+ # Wrap a function that produces a mako template for better debugging
+
+ # http://stackoverflow.com/questions/390409/how-do-you-debug-mako-templates
+ @wraps( template_fn )
+ def wrapper( *args, **kwargs ):
+ try:
+ log.debug( 'rendering template' )
+ return template_fn( *args, **kwargs )
+ log.debug( 'done rendering template' )
+ except Exception, ex:
+ log.error( "Mako Exception: " + str( ex ), exc_info=True )
+ return exceptions.html_error_template().render()
+ return wrapper
+
+def prep_dataset( hda, trans ):
+ states = trans.app.model.Dataset.states
+ print states
+ STATES_INTERPRETED_AS_QUEUED = [ 'no state', '', None ]
+ ## dataset is actually a HistoryDatasetAssociation
+ #ported mostly from history_common
+ #post: return dictionary form
+ #??: move out of templates?
+ #??: gather data from model?
+
+
+ #TODO: clean up magic strings
+
+ hda_dict = hda.get_api_value()
+ #TODO: use hda_dict.update to add these (instead of localvars)
+ def add_to_hda( **kwargs ):
+ hda_dict.update( kwargs )
+
+ # trans
+ encoded_hda_id = trans.security.encode_id( hda.id )
+ add_to_hda( id=encoded_hda_id )
+
+ add_to_hda( state=hda.state )
+ if hda.state in STATES_INTERPRETED_AS_QUEUED:
+ #TODO: magic string
+ add_to_hda( state='queued' )
+
+ # trans
+ current_user_roles = trans.get_current_user_roles()
+ add_to_hda( can_edit=( not ( hda.deleted or hda.purged ) ) )
+
+ # considered accessible if user can access or user isn't admin
+ # trans
+ accessible = trans.app.security_agent.can_access_dataset( current_user_roles, hda.dataset )
+ accessible = trans.user_is_admin() or accessible
+ add_to_hda( accessible=accessible )
+
+ #TODO: move urls into js galaxy_paths (decorate)
+ deleted = hda.deleted
+ purged = hda.purged
+ dataset_purged = hda.dataset.purged
+ if not ( dataset_purged or purged ) and for_editing:
+ undelete_url = web_helpers.url_for( controller='dataset', action='undelete', dataset_id=encoded_hda_id )
+ # trans
+ if trans.app.config.allow_user_dataset_purge:
+ purge_url = web_helpers.url_for( controller='dataset', action='purge', dataset_id=encoded_hda_id )
+
+ if not hda.visible:
+ unhide_url = web_helpers.url_for( controller='dataset', action='unhide', dataset_id=encoded_hda_id )
+
+
+ print 'dataset:', dataset
+
+ undelete_url = web_helpers.url_for( controller='dataset', action='undelete', dataset_id=encoded_hda_id )
+ purge_url = web_helpers.url_for( controller='dataset', action='purge', dataset_id=encoded_hda_id )
+ unhide_url = web_helpers.url_for( controller='dataset', action='unhide', dataset_id=encoded_hda_id )
+ print 'urls:', '\n'.join( undelete_url, purge_url, unhide_url )
+
+ return hda_dict
+
+"""
\ No newline at end of file
diff -r 01ed2f462dd7709876458b031d786d277d1f72f3 -r 9beb21b505e9485e27902d75bf342beea80c6b7e static/scripts/mvc/history.js
--- a/static/scripts/mvc/history.js
+++ b/static/scripts/mvc/history.js
@@ -1,145 +1,1060 @@
/*
-TODO:
+Backbone.js implementation of history panel
+
+TODO:
+ replicate then refactor (could be the wrong order)
+ History meta controls (rename, annotate, etc. - see history.js.120823.bak)
+ choose a templating system and use it consistently
+ HIview state transitions (eg. upload -> ok), curr: build new, delete old, place new (in render)
+ events (local/ui and otherwise)
+ widget building (popupmenu, etc.)
+ localization
+ incorporate relations
+ convert function comments to /** style
+ complete comments
+
as always: where does the model end and the view begin?
HistoryPanel
- HistoryCollection: (collection of histories: 'Saved Histories')
+ HistoryCollection: (collection of History: 'Saved Histories')
-CASES:
- logged-in/NOT
+ CASES:
+ logged-in/NOT
+ show-deleted/DONT
+
+ ?? anyway to _clone_ base HistoryItemView instead of creating a new one each time?
+
+ move inline styles into base.less
+ add classes, ids on empty divs
+ localized text from template
+ watch the magic strings
+
+ poly HistoryItemView on: can/cant_edit
*/
+_ = _;
+
//==============================================================================
-var HistoryItem = BaseModel.extend({
- // a single history structure
- // from: http://localhost:8080/api/histories/f2db41e1fa331b3e/contents/f2db41e1fa331…
- /*
- {
- "data_type": "fastq",
- "deleted": false,
- "download_url": "/datasets/f2db41e1fa331b3e/display?to_ext=fastq",
- "file_size": 226297533,
- "genome_build": "?",
- "id": "f2db41e1fa331b3e",
- "metadata_data_lines": null,
- "metadata_dbkey": "?",
- "metadata_sequences": null,
- "misc_blurb": "215.8 MB",
- "misc_info": "uploaded fastq file",
- "model_class": "HistoryDatasetAssociation",
- "name": "LTCF-2-19_GTGAAA_L001_R1_001.fastq",
- "state": "ok",
- "visible": true
+var Loggable = {
+ // replace null with console (if available) to see all logs
+ logger : null,
+
+ log : function(){
+ return ( this.logger )?( this.logger.debug.apply( this, arguments ) )
+ :( undefined );
}
- */
+};
+var LoggingModel = BaseModel.extend( Loggable );
+var LoggingView = BaseView.extend( Loggable );
+
+//==============================================================================
+//TODO: move to Galaxy obj./namespace, decorate for current page (as GalaxyPaths)
+var Localizable = {
+ localizedStrings : {},
+ setLocalizedString : function( str, localizedString ){
+ this.localizedStrings[ str ] = localizedString;
+ },
+ localize : function( str ){
+ if( str in this.localizedStrings ){ return this.localizedStrings[ str ]; }
+ return str;
+ }
+};
+var LocalizableView = LoggingView.extend( Localizable );
+//TODO: wire up to views
+
+//==============================================================================
+// jq plugin?
+//?? into template? I dunno: need to handle variadic keys, remove empty attrs (href="")
+//TODO: not happy with this (a 4th rendering/templating system!?) or it being global
+function linkHTMLTemplate( config, tag ){
+ // Create an anchor (or any tag) using any config params passed in
+ //NOTE!: send class attr as 'classes' to avoid res. keyword collision (jsLint)
+ if( !config ){ return '<a></a>'; }
+ tag = tag || 'a';
- display : function(){},
- edit_attr : function(){},
- delete : function(){},
- download : function(){},
- details : function(){},
- rerun : function(){},
- tags : function(){},
- annotations : function(){},
- peek : function(){},
+ var template = [ '<' + tag ];
+ for( key in config ){
+ var val = config[ key ];
+ if( val === '' ){ continue; }
+ switch( key ){
+ case 'text': continue;
+ case 'classes':
+ // handle keyword class which is also an HTML attr name
+ key = 'class';
+ val = ( config.classes.join )?( config.classes.join( ' ' ) ):( config.classes );
+ //note: lack of break (fall through)
+ default:
+ template.push( [ ' ', key, '="', val, '"' ].join( '' ) );
+ }
+ }
+ template.push( '>' );
+ if( 'text' in config ){ template.push( config.text ); }
+ template.push( '</' + tag + '>' );
+
+ return template.join( '' );
+}
+
+//==============================================================================
+var HistoryItem = LoggingModel.extend({
+ // a single HDA model
+
+ // uncomment this out see log messages
+ //logger : console,
+
+ defaults : {
+
+ id : null,
+ name : '',
+ data_type : null,
+ file_size : 0,
+ genome_build : null,
+ metadata_data_lines : 0,
+ metadata_dbkey : null,
+ metadata_sequences : 0,
+ misc_blurb : '',
+ misc_info : '',
+ model_class : '',
+ state : '',
+ deleted : false,
+ purged : false,
+
+ // clash with BaseModel here?
+ visible : true,
+
+ for_editing : true,
+ // additional urls will be passed and added, if permissions allow their use
+
+ bodyIsShown : false
+ },
+
+ initialize : function(){
+ this.log( this + '.initialize', this.attributes );
+ this.log( '\tparent history_id: ' + this.get( 'history_id' ) );
+ },
+
+ isEditable : function(){
+ // roughly can_edit from history_common.mako
+ return ( !( this.get( 'deleted' ) || this.get( 'purged' ) ) );
+ },
+
+ hasData : function(){
+ //TODO:?? is this equivalent to all possible hda.has_data calls?
+ return ( this.get( 'file_size' ) > 0 );
+ },
+
+ toString : function(){
+ return 'HistoryItem(' + ( this.get( 'name' ) || this.get( 'id' ) || '' ) + ')';
+ }
});
-//..............................................................................
-var HistoryItemView = BaseView.extend({
- // view for History model used in HistoryPanelView
+//------------------------------------------------------------------------------
+HistoryItem.STATES = {
+ NEW : 'new',
+ UPLOAD : 'upload',
+ QUEUED : 'queued',
+ RUNNING : 'running',
+ OK : 'ok',
+ EMPTY : 'empty',
+ ERROR : 'error',
+ DISCARDED : 'discarded',
+ SETTING_METADATA : 'setting_metadata',
+ FAILED_METADATA : 'failed_metadata'
+};
+
+
+//==============================================================================
+var HistoryItemView = LoggingView.extend({
+ // view for HistoryItem model above
+
+ // uncomment this out see log messages
+ logger : console,
+
tagName : "div",
className : "historyItemContainer",
- icons : {
- display : 'path to icon',
- edit_attr : 'path to icon',
- delete : 'path to icon',
- download : 'path to icon',
- details : 'path to icon',
- rerun : 'path to icon',
- tags : 'path to icon',
- annotations : 'path to icon',
+
+ // ................................................................................ SET UP
+ initialize : function(){
+ this.log( this + '.initialize:', this, this.model );
+ return this;
+ },
+
+ // ................................................................................ RENDER MAIN
+ //??: this style builds an entire, new DOM tree - is that what we want??
+ render : function(){
+ this.log( this + '.model:', this.model );
+ var id = this.model.get( 'id' ),
+ state = this.model.get( 'state' );
+
+ this.$el.attr( 'id', 'historyItemContainer-' + id );
+
+ var itemWrapper = $( '<div/>' ).attr( 'id', 'historyItem-' + id )
+ .addClass( 'historyItemWrapper' ).addClass( 'historyItem' )
+ .addClass( 'historyItem-' + state );
+
+ itemWrapper.append( this._render_purgedWarning() );
+ itemWrapper.append( this._render_deletionWarning() );
+ itemWrapper.append( this._render_visibleWarning() );
+
+ itemWrapper.append( this._render_titleBar() );
+
+ this.body = $( this._render_body() );
+ itemWrapper.append( this.body );
+
+ // set up canned behavior (bootstrap, popupmenus, editable_text, etc.)
+ itemWrapper.find( '.tooltip' ).tooltip({ placement : 'bottom' });
+
+ //TODO: broken
+ var popupmenus = itemWrapper.find( '[popupmenu]' );
+ popupmenus.each( function( i, menu ){
+ menu = $( menu );
+ make_popupmenu( menu );
+ });
+
+ //TODO: better transition/method than this...
+ this.$el.children().remove();
+ return this.$el.append( itemWrapper );
+ },
+
+ // ................................................................................ RENDER WARNINGS
+ //TODO: refactor into generalized warning widget/view
+ //TODO: refactor the three following - too much common ground
+ _render_purgedWarning : function(){
+ // Render warnings for purged
+ //this.log( this + '_render_purgedWarning' );
+ var warning = null;
+ if( this.model.get( 'purged' ) ){
+ warning = $( HistoryItemView.STRINGS.purgedMsg );
+ }
+ //this.log( 'warning:', warning );
+ return warning;
+ },
+
+ _render_deletionWarning : function(){
+ //this.log( this + '_render_deletionWarning' );
+ // Render warnings for deleted items (and links: undelete and purge)
+ //pre: this.model.purge_url will be undefined if trans.app.config.allow_user_dataset_purge=False
+ var warningElem = null;
+ if( this.model.get( 'deleted' ) ){
+ var warning = '';
+
+ if( this.model.get( 'undelete_url' ) ){
+ warning += HistoryItemView.TEMPLATES.undeleteLink( this.model.attributes );
+ }
+ if( this.model.get( 'purge_url' ) ){
+ warning += HistoryItemView.TEMPLATES.purgeLink( this.model.attributes );
+ }
+ // wrap it in the standard warning msg
+ warningElem = $( HistoryItemView.TEMPLATES.warningMsg({ warning: warning }) );
+ }
+ //this.log( 'warning:', warning );
+ return warningElem;
+ },
+
+ _render_visibleWarning : function(){
+ //this.log( this + '_render_visibleWarning' );
+ // Render warnings for hidden items (and link: unhide)
+ var warningElem = null;
+ if( !this.model.get( 'visible' ) && this.model.get( 'unhide_url' ) ){
+ var warning = HistoryItemView.TEMPLATES.hiddenMsg( this.model.attributes );
+
+ // wrap it in the standard warning msg
+ warningElem = $( HistoryItemView.TEMPLATES.warningMsg({ warning: warning }) );
+ }
+ //this.log( 'warning:', warning );
+ return warningElem;
+ },
+
+ // ................................................................................ RENDER TITLEBAR
+ _render_titleBar : function(){
+ var titleBar = $( '<div class="historyItemTitleBar" style="overflow: hidden"></div>' );
+ titleBar.append( this._render_titleButtons() );
+ titleBar.append( '<span class="state-icon"></span>' );
+ titleBar.append( this._render_titleLink() );
+ return titleBar;
+ },
+
+ // ................................................................................ DISPLAY, EDIT ATTR, DELETE
+ _render_titleButtons : function(){
+ // render the display, edit attr and delete icon-buttons
+ var buttonDiv = $( '<div class="historyItemButtons"></div>' ),
+ for_editing = this.model.get( 'for_editing' );
+
+ // don't show display, edit while uploading
+ if( this.model.get( 'state' ) !== HistoryItem.STATES.UPLOAD ){
+ buttonDiv.append( this._render_displayButton() );
+
+ if( for_editing ){ buttonDiv.append( this._render_editButton() ); }
+ }
+ if( for_editing ){ buttonDiv.append( this._render_deleteButton() ); }
+ return buttonDiv;
+ },
+
+ //TODO: refactor the following three - use extend for new href (with model data or something else)
+ //TODO: move other data (non-href) into {} in view definition, cycle over those keys in _titlebuttons
+ //TODO: move disabled span data into view def, move logic into _titlebuttons
+ _render_displayButton : function(){
+ // render the display icon-button
+ // show a disabled display if the data's been purged
+ if( this.model.get( 'purged' ) ){
+ return $( '<span class="icon-button display_disabled tooltip" ' +
+ 'title="Cannot display datasets removed from disk"></span>' );
+ }
+ var id = this.model.get( 'id' ),
+ displayBtnData = {
+ //TODO: localized title
+ title : 'Display data in browser',
+ //TODO: need generated url here
+ href : '/datasets/' + id + '/display/?preview=True',
+ target : ( this.model.get( 'for_editing' ) )?( 'galaxy_main' ):( '' ),
+ classes : [ 'icon-button', 'tooltip', 'display' ],
+ dataset_id : id
+ };
+ return $( linkHTMLTemplate( displayBtnData ) );
+ },
+
+ _render_editButton : function(){
+ // render the edit attr icon-button
+ var id = this.model.get( 'id' ),
+ purged = this.model.get( 'purged' ),
+ deleted = this.model.get( 'deleted' );
+
+ if( deleted || purged ){
+ if( !purged ){
+ return $( '<span class="icon-button edit_disabled tooltip" ' +
+ 'title="Undelete dataset to edit attributes"></span>' );
+ } else {
+ return $( '<span class="icon-button edit_disabled tooltip" ' +
+ 'title="Cannot edit attributes of datasets removed from disk"></span>' );
+ }
+ }
+ return $( linkHTMLTemplate({
+ title : 'Edit attributes',
+ //TODO: need generated url here
+ href : '/datasets/' + id + '/edit',
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'edit' ]
+ }) );
+ },
+
+ _render_deleteButton : function(){
+ // render the delete icon-button
+ var id = this.model.get( 'id' ),
+ purged = this.model.get( 'purged' ),
+ deleted = this.model.get( 'deleted' );
+
+ //??: WHAAAA? can_edit == deleted??
+ // yes! : (history_common.mako) can_edit=( not ( data.deleted or data.purged ) )
+ if( purged || deleted ){
+ return $( '<span title="Dataset is already deleted" ' +
+ 'class="icon-button delete_disabled tooltip"></span>' );
+ }
+ return $( linkHTMLTemplate({
+ title : 'Delete',
+ //TODO: need generated url here
+ href : '/datasets/' + id + '/delete?show_deleted_on_refresh=False',
+ target : 'galaxy_main',
+ id : 'historyItemDeleter-' + id,
+ classes : [ 'icon-button', 'tooltip', 'delete' ]
+ }));
+ },
+
+ _render_titleLink : function(){
+ // render the title (hda name)
+ var h_name = this.model.get( 'name' ),
+ hid = this.model.get( 'hid' ),
+ title = ( hid + ': ' + h_name );
+ return $( linkHTMLTemplate({
+ href : 'javascript:void(0);',
+ text : '<span class="historyItemTitle">' + title + '</span>'
+ }));
+ },
+
+ // ................................................................................ RENDER BODY
+ // _render_body fns for the various states
+ _render_body_not_viewable : function( parent ){
+ parent.append( $( '<div>You do not have permission to view this dataset.</div>' ) );
+ },
+
+ _render_body_uploading : function( parent ){
+ parent.append( $( '<div>Dataset is uploading.</div>' ) );
+ },
+
+ _render_body_queued : function( parent ){
+ parent.append( $( '<div>Job is waiting to run.</div>' ) );
+ parent.append( this._render_showParamsAndRerun() );
+ },
+
+ _render_body_running : function( parent ){
+ parent.append( '<div>Job is currently running.</div>' );
+ parent.append( this._render_showParamsAndRerun() );
+ },
+
+ _render_body_error : function( parent ){
+ if( !this.model.get( 'purged' ) ){
+ parent.append( $( '<div>' + this.model.get( 'misc_blurb' ) + '</div>' ) );
+ }
+ parent.append( ( 'An error occurred running this job: '
+ + '<i>' + $.trim( this.model.get( 'misc_info' ) ) + '</i>' ) );
+
+ var actionBtnDiv = $( this._render_showParamsAndRerun() );
+ // bug report button
+ //NOTE: these are shown _before_ info, rerun so use _prepend_
+ if( this.model.get( 'for_editing' ) ){
+ actionBtnDiv.prepend( $( linkHTMLTemplate({
+ title : 'View or report this error',
+ href : this.model.get( 'report_errors_url' ),
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'bug' ]
+ })));
+ }
+ if( this.model.hasData() ){
+ //TODO: render_download_links( data, dataset_id )
+ // download dropdown
+ actionBtnDiv.prepend( this._render_downloadLinks() );
+ }
+ parent.append( actionBtnDiv );
+ },
+
+ _render_body_discarded : function( parent ){
+ parent.append( '<div>The job creating this dataset was cancelled before completion.</div>' );
+ parent.append( this._render_showParamsAndRerun() );
+ },
+
+ _render_body_setting_metadata : function( parent ){
+ parent.append( $( '<div>Metadata is being auto-detected.</div>' ) );
+ },
+
+ _render_body_empty : function( parent ){
+ //TODO: replace i with dataset-misc-info class
+ //?? why are we showing the file size when we know it's zero??
+ parent.append( $( '<div>No data: <i>' + this.model.get( 'misc_blurb' ) + '</i></div>' ) );
+ parent.append( this._render_showParamsAndRerun() );
+ },
+
+ _render_body_failed_metadata : function( parent ){
+ // add a message box about the failure at the top of the body, then...
+ var warningMsgText = 'An error occurred setting the metadata for this dataset.';
+ if( this.model.isEditable() ){
+ var editLink = linkHTMLTemplate({
+ text : 'set it manually or retry auto-detection',
+ href : this.model.get( 'edit_url' ),
+ target : 'galaxy_main'
+ });
+ warningMsgText += ' You may be able to ' + editLink + '.';
+ }
+ parent.append( $( HistoryItemView.TEMPLATES.warningMsg({ warning: warningMsgText }) ) );
+
+ //...render the remaining body as STATES.OK (only diff between these states is the box above)
+ this._render_body_ok( parent );
+ },
+
+ _render_body_ok : function( parent ){
+ // most common state renderer and the most complicated
+
+ // build the summary info (using template and dbkey data)
+ parent.append( this._render_hdaSummary() );
+
+ if( this.model.get( 'misc_info' ) ){
+ parent.append( $( '<div class="info">Info: ' + this.model.get( 'misc_info' ) + '</div>' ) );
+ }
+
+ // hasData
+ if( this.model.hasData() ){
+ var actionBtnDiv = $( '<div/>' );
+
+ // render download links, show_params
+ actionBtnDiv.append( this._render_downloadLinks() );
+ actionBtnDiv.append( $( linkHTMLTemplate({
+ title : 'View details',
+ href : this.model.get( 'show_params_url' ),
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'information' ]
+ })));
+
+ // if for_editing
+ if( this.model.get( 'for_editing' ) ){
+
+ // rerun
+ actionBtnDiv.append( $( linkHTMLTemplate({
+ title : 'Run this job again',
+ href : this.model.get( 'rerun_url' ),
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'arrow-circle' ]
+ })));
+
+ if( this.model.get( 'trackster_urls' ) ){
+ // link to trackster
+ var trackster_urls = this.model.get( 'trackster_urls' );
+ actionBtnDiv.append( $( linkHTMLTemplate({
+ title : 'View in Trackster',
+ href : "javascript:void(0)",
+ classes : [ 'icon-button', 'tooltip', 'chart_curve', 'trackster-add' ],
+ // prob just _.extend
+ 'data-url' : trackster_urls[ 'data-url' ],
+ 'action-url': trackster_urls[ 'action-url' ],
+ 'new-url' : trackster_urls[ 'new-url' ]
+ })));
+ }
+
+ // if trans.user
+ if( this.model.get( 'retag_url' ) && this.model.get( 'annotate_url' ) ){
+ // tags, annotation buttons and display areas
+ var tagsAnnotationsBtns = $( '<div style="float: right;"></div>' );
+ tagsAnnotationsBtns.append( $( linkHTMLTemplate({
+ title : 'Edit dataset tags',
+ target : 'galaxy_main',
+ href : this.model.get( 'retag_url' ),
+ classes : [ 'icon-button', 'tooltip', 'tags' ]
+ })));
+ tagsAnnotationsBtns.append( $( linkHTMLTemplate({
+ title : 'Edit dataset annotation',
+ target : 'galaxy_main',
+ href : this.model.get( 'annotate_url' ),
+ classes : [ 'icon-button', 'tooltip', 'annotate' ]
+ })));
+ actionBtnDiv.append( tagsAnnotationsBtns );
+ actionBtnDiv.append( '<div style="clear: both"></div>' );
+
+ var tagArea = $( '<div class="tag-area" style="display: none">' );
+ tagArea.append( '<strong>Tags:</strong>' );
+ tagArea.append( '<div class="tag-elt"></div>' );
+ actionBtnDiv.append( tagArea );
+
+ var annotationArea = $( ( '<div id="' + this.model.get( 'id' ) + '-annotation-area"'
+ + ' class="annotation-area" style="display: none">' ) );
+ annotationArea.append( '<strong>Annotation:</strong>' );
+ annotationArea.append( ( '<div id="' + this.model.get( 'id' ) + '-annotation-elt" '
+ + 'style="margin: 1px 0px 1px 0px" class="annotation-elt tooltip editable-text" '
+ + 'title="Edit dataset annotation"></div>' ) );
+ actionBtnDiv.append( annotationArea );
+ }
+ }
+ // clear div
+ actionBtnDiv.append( '<div style="clear: both;"></div>' );
+ parent.append( actionBtnDiv );
+
+ var display_appsDiv = $( '<div/>' );
+ if( this.model.get( 'display_apps' ) ){
+ var display_apps = this.model.get( 'display_apps' ),
+ display_app_span = $( '<span/>' );
+
+ //TODO: grrr...somethings not in the right scope here
+ for( var app_name in display_apps ){
+ //TODO: to template
+ var display_app = display_apps[ app_name ],
+ display_app_HTML = app_name + ' ';
+ for( var location_name in display_app ){
+ display_app_HTML += linkHTMLTemplate({
+ text : location_name,
+ href : display_app[ location_name ].url,
+ target : display_app[ location_name ].target
+ }) + ' ';
+ }
+ display_app_span.append( display_app_HTML );
+ }
+ display_appsDiv.append( display_app_span );
+ }
+ //display_appsDiv.append( '<br />' );
+ parent.append( display_appsDiv );
+
+ } else if( this.model.get( 'for_editing' ) ){
+ parent.append( this._render_showParamsAndRerun() );
+ }
+
+ parent.append( this._render_peek() );
+ },
+
+ _render_body : function(){
+ this.log( this + '_render_body' );
+ var state = this.model.get( 'state' ),
+ for_editing = this.model.get( 'for_editing' );
+ this.log( 'state:', state, 'for_editing', for_editing );
+
+ //TODO: incorrect id (encoded - use hid?)
+ var body = $( '<div/>' )
+ .attr( 'id', 'info-' + this.model.get( 'id' ) )
+ .addClass( 'historyItemBody' )
+ .attr( 'style', 'display: block' );
+
+ switch( state ){
+ case HistoryItem.STATES.NOT_VIEWABLE :
+ this._render_body_not_viewable( body );
+ break;
+ case HistoryItem.STATES.UPLOAD :
+ this._render_body_uploading( body );
+ break;
+ case HistoryItem.STATES.QUEUED :
+ this._render_body_queued( body );
+ break;
+ case HistoryItem.STATES.RUNNING :
+ this._render_body_running( body );
+ break;
+ case HistoryItem.STATES.ERROR :
+ this._render_body_error( body );
+ break;
+ case HistoryItem.STATES.DISCARDED :
+ this._render_body_discarded( body );
+ break;
+ case HistoryItem.STATES.SETTING_METADATA :
+ this._render_body_setting_metadata( body );
+ break;
+ case HistoryItem.STATES.EMPTY :
+ this._render_body_empty( body );
+ break;
+ case HistoryItem.STATES.FAILED_METADATA :
+ this._render_body_failed_metadata( body );
+ break;
+ case HistoryItem.STATES.OK :
+ this._render_body_ok( body );
+ break;
+ default:
+ //??: no body?
+ body.append( $( '<div>Error: unknown dataset state "' + state + '".</div>' ) );
+ }
+
+ body.append( '<div style="clear: both"></div>' );
+ if( this.model.get( 'bodyIsShown' ) === false ){
+ body.hide();
+ }
+ return body;
+ },
+
+ _render_hdaSummary : function(){
+ // default span rendering
+ var dbkeyHTML = _.template( '<span class="<%= dbkey %>"><%= dbkey %></span>',
+ { dbkey: this.model.get( 'metadata_dbkey' ) } );
+ // if there's no dbkey and it's editable : render a link to editing in the '?'
+ if( this.model.get( 'metadata_dbkey' ) === '?' && this.model.isEditable() ){
+ dbkeyHTML = linkHTMLTemplate({
+ text : this.model.get( 'metadata_dbkey' ),
+ href : this.model.get( 'edit_url' ),
+ target : 'galaxy_main'
+ });
+ }
+ return ( HistoryItemView.TEMPLATES.hdaSummary(
+ _.extend({ dbkeyHTML: dbkeyHTML }, this.model.attributes ) ) );
+ },
+
+ _render_showParamsAndRerun : function(){
+ //TODO??: generalize to _render_actionButtons, pass in list of 'buttons' to render, default to these two
+ var actionBtnDiv = $( '<div/>' );
+ actionBtnDiv.append( $( linkHTMLTemplate({
+ title : 'View details',
+ href : this.model.get( 'show_params_url' ),
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'information' ]
+ })));
+ if( this.model.get( 'for_editing' ) ){
+ actionBtnDiv.append( $( linkHTMLTemplate({
+ title : 'Run this job again',
+ href : this.model.get( 'rerun_url' ),
+ target : 'galaxy_main',
+ classes : [ 'icon-button', 'tooltip', 'arrow-circle' ]
+ })));
+ }
+ return actionBtnDiv;
+ },
+
+ _render_downloadLinks : function(){
+ // return either: a single download icon-button (if there are no meta files)
+ // or a popupmenu with links to download assoc. meta files (if there are meta files)
+
+ // don't show anything if the data's been purged
+ if( this.model.get( 'purged' ) ){ return null; }
+
+ var downloadLink = linkHTMLTemplate({
+ title : 'Download',
+ href : this.model.get( 'download_url' ),
+ classes : [ 'icon-button', 'tooltip', 'disk' ]
+ });
+
+ // if no metafiles, return only the main download link
+ var download_meta_urls = this.model.get( 'download_meta_urls' );
+ if( !download_meta_urls ){
+ return downloadLink;
+ }
+
+ // build the popupmenu for downloading main, meta files
+ var popupmenu = $( '<div popupmenu="dataset-' + this.model.get( 'id' ) + '-popup"></div>' );
+ popupmenu.append( linkHTMLTemplate({
+ text : 'Download Dataset',
+ title : 'Download',
+ href : this.model.get( 'download_url' ),
+ classes : [ 'icon-button', 'tooltip', 'disk' ]
+ }));
+ popupmenu.append( '<a>Additional Files</a>' );
+ for( var file_type in download_meta_urls ){
+ popupmenu.append( linkHTMLTemplate({
+ text : 'Download ' + file_type,
+ href : download_meta_urls[ file_type ],
+ classes : [ 'action-button' ]
+ }));
+ }
+ var menuButton = $( ( '<div style="float:left;" class="menubutton split popup"'
+ + ' id="dataset-' + this.model.get( 'id' ) + '-popup"></div>' ) );
+ menuButton.append( downloadLink );
+ popupmenu.append( menuButton );
+ return popupmenu;
+ },
+
+ _render_peek : function(){
+ if( !this.model.get( 'peek' ) ){ return null; }
+ return $( '<div/>' ).append(
+ $( '<pre/>' )
+ .attr( 'id', 'peek' + this.model.get( 'id' ) )
+ .addClass( 'peek' )
+ .append( this.model.get( 'peek' ) )
+ );
+ },
+
+ // ................................................................................ EVENTS
+ events : {
+ 'click .historyItemTitle' : 'toggleBodyVisibility'
+ },
+
+ // ................................................................................ STATE CHANGES / MANIPULATION
+ toggleBodyVisibility : function(){
+ this.log( this + '.toggleBodyVisibility' );
+ this.$el.find( '.historyItemBody' ).toggle();
+ },
+
+ // ................................................................................ UTILITY
+ toString : function(){
+ var modelString = ( this.model )?( this.model + '' ):( '' );
+ return 'HistoryItemView(' + modelString + ')';
+ }
+});
+
+//------------------------------------------------------------------------------
+HistoryItemView.TEMPLATES = {};
+
+//TODO: move next one out - doesn't belong
+HistoryItemView.TEMPLATES.warningMsg =
+ _.template( '<div class="warningmessagesmall"><strong><%= warning %></strong></div>' );
+
+
+//??TODO: move into functions?
+HistoryItemView.TEMPLATES.undeleteLink = _.template(
+ 'This dataset has been deleted. ' +
+ 'Click <a href="<%= undelete_url %>" class="historyItemUndelete" id="historyItemUndeleter-<%= id %>" ' +
+ ' target="galaxy_history">here</a> to undelete it.' );
+
+HistoryItemView.TEMPLATES.purgeLink = _.template(
+ ' or <a href="<%= purge_url %>" class="historyItemPurge" id="historyItemPurger-<%= id %>"' +
+ ' target="galaxy_history">here</a> to immediately remove it from disk.' );
+
+HistoryItemView.TEMPLATES.hiddenMsg = _.template(
+ 'This dataset has been hidden. ' +
+ 'Click <a href="<%= unhide_url %>" class="historyItemUnhide" id="historyItemUnhider-<%= id %>" ' +
+ ' target="galaxy_history">here</a> to unhide it.' );
+
+//TODO: contains localized strings
+HistoryItemView.TEMPLATES.hdaSummary = _.template([
+ '<%= misc_blurb %><br />',
+ 'format: <span class="<%= data_type %>"><%= data_type %></span>, ',
+ 'database: <%= dbkeyHTML %>'
+].join( '' ));
+
+
+//------------------------------------------------------------------------------
+HistoryItemView.STRINGS = {};
+
+HistoryItemView.STRINGS.purgedMsg = HistoryItemView.TEMPLATES.warningMsg(
+ { warning: 'This dataset has been deleted and removed from disk.' });
+
+
+//==============================================================================
+var HistoryCollection = Backbone.Collection.extend({
+ model : HistoryItem,
+
+ toString : function(){
+ return ( 'HistoryCollection()' );
+ }
+});
+
+
+//==============================================================================
+var History = LoggingModel.extend({
+
+ // comment this out to suppress log messages
+ logger : console,
+
+ // values from api (may need more)
+ defaults : {
+ id : '',
+ name : '',
+ state : '',
+ state_details : {
+ discarded : 0,
+ empty : 0,
+ error : 0,
+ failed_metadata : 0,
+ ok : 0,
+ queued : 0,
+ running : 0,
+ setting_metadata: 0,
+ upload : 0
+ }
+ },
+
+ initialize : function( data, history_datasets ){
+ this.log( this + '.initialize', data, history_datasets );
+ this.items = new HistoryCollection();
+ },
+
+ loadDatasetsAsHistoryItems : function( datasets ){
+ // adds the given dataset/Item data to historyItems
+ // and updates this.state based on their states
+ //pre: datasets is a list of objs
+ //this.log( this + '.loadDatasets', datasets );
+ var self = this,
+ selfID = this.get( 'id' ),
+ stateDetails = this.get( 'state_details' );
+
+ _.each( datasets, function( dataset, index ){
+ self.log( 'loading dataset: ', dataset, index );
+
+ // create an item sending along the history_id as well
+ var historyItem = new HistoryItem(
+ _.extend( dataset, { history_id: selfID } ) );
+ self.log( 'as History:', historyItem );
+ self.items.add( historyItem );
+
+ // add item's state to running totals in stateDetails
+ var itemState = dataset.state;
+ stateDetails[ itemState ] += 1;
+ });
+
+ // get overall History state from totals
+ this.set( 'state_details', stateDetails );
+ this._stateFromStateDetails();
+ return this;
+ },
+
+ _stateFromStateDetails : function(){
+ // sets this.state based on current historyItems' states
+ // ported from api/histories.traverse
+ //pre: state_details is current counts of dataset/item states
+ this.set( 'state', '' );
+ var stateDetails = this.get( 'state_details' );
+
+ //TODO: make this more concise
+ if( ( stateDetails.error > 0 )
+ || ( stateDetails.failed_metadata > 0 ) ){
+ this.set( 'state', HistoryItem.STATES.ERROR );
+
+ } else if( ( stateDetails.running > 0 )
+ || ( stateDetails.setting_metadata > 0 ) ){
+ this.set( 'state', HistoryItem.STATES.RUNNING );
+
+ } else if( stateDetails.queued > 0 ){
+ this.set( 'state', HistoryItem.STATES.QUEUED );
+
+ } else if( stateDetails.ok === this.items.length ){
+ this.set( 'state', HistoryItem.STATES.OK );
+
+ } else {
+ throw( '_stateFromStateDetails: unable to determine '
+ + 'history state from state details: ' + JSON.stringify( stateDetails ) );
+ }
+ return this;
+ },
+
+ toString : function(){
+ var nameString = ( this.get( 'name' ) )?
+ ( ',' + this.get( 'name' ) ) : ( '' );
+ return 'History(' + this.get( 'id' ) + nameString + ')';
+ }
+});
+
+//------------------------------------------------------------------------------
+var HistoryView = LoggingView.extend({
+ // view for the HistoryCollection (as per current right hand panel)
+
+ // comment this out to suppress log messages
+ logger : console,
+
+ // direct attachment to existing element
+ el : 'body.historyPage',
+
+ initialize : function(){
+ this.log( this + '.initialize' );
+ this.itemViews = [];
+ var parent = this;
+ this.model.items.each( function( item ){
+ var itemView = new HistoryItemView({ model: item });
+ parent.itemViews.push( itemView );
+ });
+ //itemViews.reverse();
},
render : function(){
- this.$el.append( 'div' )
+ this.log( this + '.render' );
+
+ // render to temp, move all at once, remove temp holder
+ //NOTE!: render in reverse (newest on top) via prepend (instead of append)
+ var tempDiv = $( '<div/>' );
+ _.each( this.itemViews, function( view ){
+ tempDiv.prepend( view.render() );
+ });
+ this.$el.append( tempDiv.children() );
+ tempDiv.remove();
},
+ toString : function(){
+ var nameString = this.model.get( 'name' ) || '';
+ return 'HistoryView(' + nameString + ')';
+ }
});
+//==============================================================================
+//USE_MOCK_DATA = true;
+if( window.USE_MOCK_DATA ){
+ mockHistory = {};
+ mockHistory.data = {
+
+ template : {
+ id : 'a799d38679e985db',
+ name : 'template',
+ data_type : 'fastq',
+ file_size : 226297533,
+ genome_build : '?',
+ metadata_data_lines : 0,
+ metadata_dbkey : '?',
+ metadata_sequences : 0,
+ misc_blurb : '215.8 MB',
+ misc_info : 'uploaded fastq file',
+ model_class : 'HistoryDatasetAssociation',
+ download_url : '',
+ state : 'ok',
+ visible : true,
+ deleted : false,
+ purged : false,
+
+ hid : 0,
+ //TODO: move to history
+ for_editing : true,
+ //for_editing : false,
+
+ //?? not needed
+ //can_edit : true,
+ //can_edit : false,
+
+ //TODO: move into model functions (build there (and cache?))
+ //!! be careful with adding these accrd. to permissions
+ //!! IOW, don't send them via template/API if the user doesn't have perms to use
+ //!! (even if they don't show up)
+ undelete_url : 'example.com/undelete',
+ purge_url : 'example.com/purge',
+ unhide_url : 'example.com/unhide',
+
+ display_url : 'example.com/display',
+ edit_url : 'example.com/edit',
+ delete_url : 'example.com/delete',
+
+ show_params_url : 'example.com/show_params',
+ rerun_url : 'example.com/rerun',
+
+ retag_url : 'example.com/retag',
+ annotate_url : 'example.com/annotate',
+
+ peek : [
+ '<table cellspacing="0" cellpadding="3"><tr><th>1.QNAME</th><th>2.FLAG</th><th>3.RNAME</th><th>4.POS</th><th>5.MAPQ</th><th>6.CIGAR</th><th>7.MRNM</th><th>8.MPOS</th><th>9.ISIZE</th><th>10.SEQ</th><th>11.QUAL</th><th>12.OPT</th></tr>',
+ '<tr><td colspan="100%">@SQ SN:gi|87159884|ref|NC_007793.1| LN:2872769</td></tr>',
+ '<tr><td colspan="100%">@PG ID:bwa PN:bwa VN:0.5.9-r16</td></tr>',
+ '<tr><td colspan="100%">HWUSI-EAS664L:15:64HOJAAXX:1:1:13280:968 73 gi|87159884|ref|NC_007793.1| 2720169 37 101M = 2720169 0 NAATATGACATTATTTTCAAAACAGCTGAAAATTTAGACGTACCGATTTATCTACATCCCGCGCCAGTTAACAGTGACATTTATCAATCATACTATAAAGG !!!!!!!!!!$!!!$!!!!!$!!!!!!$!$!$$$!!$!!$!!!!!!!!!!!$!</td></tr>',
+ '<tr><td colspan="100%">!!!$!$!$$!!$$!!$!!!!!!!!!!!!!!!!!!!!!!!!!!$!!$!! XT:A:U NM:i:1 SM:i:37 AM:i:0 X0:i:1 X1:i:0 XM:i:1 XO:i:0 XG:i:0 MD:Z:0A100</td></tr>',
+ '<tr><td colspan="100%">HWUSI-EAS664L:15:64HOJAAXX:1:1:13280:968 133 gi|87159884|ref|NC_007793.1| 2720169 0 * = 2720169 0 NAAACTGTGGCTTCGTTNNNNNNNNNNNNNNNGTGANNNNNNNNNNNNNNNNNNNGNNNNNNNNNNNNNNNNNNNNCNAANNNNNNNNNNNNNNNNNNNNN !!!!!!!!!!!!$!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!</td></tr>',
+ '<tr><td colspan="100%">!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!</td></tr>',
+ '</table>'
+ ].join( '' )
+ }
+
+ };
+ _.extend( mockHistory.data, {
+
+ //deleted, purged, visible
+ deleted :
+ _.extend( _.clone( mockHistory.data.template ),
+ { deleted : true }),
+ //purged :
+ // _.extend( _.clone( mockHistory.data.template ),
+ // { purged : true, deleted : true }),
+ purgedNotDeleted :
+ _.extend( _.clone( mockHistory.data.template ),
+ { purged : true }),
+ notvisible :
+ _.extend( _.clone( mockHistory.data.template ),
+ { visible : false }),
+ hasDisplayApps :
+ _.extend( _.clone( mockHistory.data.template ),
+ { display_apps : {
+ 'display in IGB' : {
+ Web: "/display_application/63cd3858d057a6d1/igb_bam/Web",
+ Local: "/display_application/63cd3858d057a6d1/igb_bam/Local"
+ }
+ }
+ }
+ ),
+ canTrackster :
+ _.extend( _.clone( mockHistory.data.template ),
+ { trackster_urls : {
+ 'data-url' : "example.com/trackster-data",
+ 'action-url' : "example.com/trackster-action",
+ 'new-url' : "example.com/trackster-new"
+ }
+ }
+ ),
+ zeroSize :
+ _.extend( _.clone( mockHistory.data.template ),
+ { file_size : 0 }),
+
+ hasMetafiles :
+ _.extend( _.clone( mockHistory.data.template ), {
+ download_meta_urls : {
+ 'bam_index' : "example.com/bam-index"
+ }
+ }),
+
+ //states
+ upload :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.UPLOAD }),
+ queued :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.QUEUED }),
+ running :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.RUNNING }),
+ empty :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.EMPTY }),
+ error :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.ERROR }),
+ discarded :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.DISCARDED }),
+ setting_metadata :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.SETTING_METADATA }),
+ failed_metadata :
+ _.extend( _.clone( mockHistory.data.template ),
+ { state : HistoryItem.STATES.FAILED_METADATA })
+
+ });
+
+ $( document ).ready( function(){
+ //mockHistory.views.deleted.logger = console;
+ mockHistory.items = {};
+ mockHistory.views = {};
+ for( var key in mockHistory.data ){
+ mockHistory.items[ key ] = new HistoryItem( mockHistory.data[ key ] );
+ mockHistory.items[ key ].set( 'name', key );
+ mockHistory.views[ key ] = new HistoryItemView({ model : mockHistory.items[ key ] });
+ //console.debug( 'view: ', mockHistory.views[ key ] );
+ $( 'body' ).append( mockHistory.views[ key ].render() );
+ }
+ });
+}
-//==============================================================================
-var History = Backbone.Collection.extend({
- // a collection of HistoryItems
-
- // from: http://localhost:8080/api/histories/f2db41e1fa331b3e
- /*
- {
- "contents_url": "/api/histories/f2db41e1fa331b3e/contents",
- "id": "f2db41e1fa331b3e",
- "name": "one",
- "state": "ok",
- "state_details": {
- "discarded": 0,
- "empty": 0,
- "error": 0,
- "failed_metadata": 0,
- "new": 0,
- "ok": 4,
- "queued": 0,
- "running": 0,
- "setting_metadata": 0,
- "upload": 0
- }
- }
- */
-
- // from: http://localhost:8080/api/histories/f2db41e1fa331b3e/contents
- // (most are replicated in HistoryItem)
- /*
- [
- {
- "id": "f2db41e1fa331b3e",
- "name": "LTCF-2-19_GTGAAA_L001_R1_001.fastq",
- "type": "file",
- "url": "/api/histories/f2db41e1fa331b3e/contents/f2db41e1fa331b3e"
- },
- {
- "id": "f597429621d6eb2b",
- "name": "LTCF-2-19_GTGAAA_L001_R2_001.fastq",
- "type": "file",
- "url": "/api/histories/f2db41e1fa331b3e/contents/f597429621d6eb2b"
- },
- {
- "id": "1cd8e2f6b131e891",
- "name": "FASTQ Groomer on data 1",
- "type": "file",
- "url": "/api/histories/f2db41e1fa331b3e/contents/1cd8e2f6b131e891"
- },
- {
- "id": "ebfb8f50c6abde6d",
- "name": "FASTQ Groomer on data 2",
- "type": "file",
- "url": "/api/histories/f2db41e1fa331b3e/contents/ebfb8f50c6abde6d"
- },
- {
- "id": "33b43b4e7093c91f",
- "name": "Sa.04-02981.fasta",
- "type": "file",
- "url": "/api/histories/f2db41e1fa331b3e/contents/33b43b4e7093c91f"
- }
- ]
- */
-});
-
-//..............................................................................
-var HistoryCollectionView = BaseView.extend({
- // view for the HistoryCollection (as per current right hand panel)
- tagName : "body",
- className : "historyCollection",
-
- render : function(){
-
- },
-
-});
-
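The priority rollup used by `_stateFromStateDetails` above (ported from `api/histories.traverse`: any error or failed_metadata outranks running/setting_metadata, which outranks queued, which outranks an all-ok history) can be sketched as a standalone function. This is a hypothetical Python sketch for illustration, not part of this commit; the function name and signature are invented:

```python
def history_state(counts, item_total):
    """Derive an overall history state from per-dataset state counts.

    Mirrors the priority order in _stateFromStateDetails: errors win,
    then running states, then queued, then an all-ok history.
    """
    if counts.get("error", 0) > 0 or counts.get("failed_metadata", 0) > 0:
        return "error"
    if counts.get("running", 0) > 0 or counts.get("setting_metadata", 0) > 0:
        return "running"
    if counts.get("queued", 0) > 0:
        return "queued"
    if counts.get("ok", 0) == item_total:
        return "ok"
    # matches the throw in the JS version when no rule applies
    raise ValueError("unable to determine history state from %r" % counts)

print(history_state({"error": 1, "running": 3}, 4))  # error outranks running
print(history_state({"ok": 4}, 4))
```

Note that, like the JS version, a mix such as one upload plus three ok datasets falls through every rule and raises.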
diff -r 01ed2f462dd7709876458b031d786d277d1f72f3 -r 9beb21b505e9485e27902d75bf342beea80c6b7e templates/root/history_common.mako
--- a/templates/root/history_common.mako
+++ b/templates/root/history_common.mako
@@ -104,6 +104,8 @@
%endif
></a>
%endif
+
+ ## edit attr button
%if for_editing:
%if data.deleted and not data.purged:
<span title="Undelete dataset to edit attributes" class="icon-button edit_disabled tooltip"></span>
@@ -113,7 +115,10 @@
<a class="icon-button edit tooltip" title='${_("Edit attributes")}' href="${h.url_for( controller='dataset', action='edit', dataset_id=dataset_id )}" target="galaxy_main"></a>
%endif
%endif
+
%endif
+
+ ## delete button
%if for_editing:
%if can_edit:
<a class="icon-button delete tooltip" title='${_("Delete")}' href="${h.url_for( controller='dataset', action='delete', dataset_id=dataset_id, show_deleted_on_refresh=show_deleted_on_refresh )}" id="historyItemDeleter-${dataset_id}"></a>
https://bitbucket.org/galaxy/galaxy-central/changeset/b822f138cd97/
changeset: b822f138cd97
user: carlfeberhard
date: 2012-08-31 18:12:28
summary: merge; add temporary template to commit
affected #: 310 files
Diff too large to display.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/df1452d451ec/
changeset: df1452d451ec
user: greg
date: 2012-08-30 20:40:28
summary: Change the implementation for importing proprietary datatype class modules that are included in installed tool shed repositories. This approach will correctly import proprietary datatype classes whose file names conflict with a Python standard library module name (e.g., if a proprietary datatype class file is named xml.py, it will conflict with the Python standard library's xml module when attempting to import it).
affected #: 1 file
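The stdlib-shadowing problem this commit solves can be demonstrated with the modern `importlib` API. This is a hedged sketch, not Galaxy code: `importlib.util.spec_from_file_location` is the current replacement for the deprecated `imp.find_module`/`imp.load_module` pair used in the diff below, and the repository path and file contents here are invented for illustration:

```python
import importlib.util
import os
import tempfile
import xml  # the stdlib package a repository file named xml.py would shadow

# Simulate a tool shed repository shipping a datatype module named xml.py.
repo_dir = tempfile.mkdtemp()
with open(os.path.join(repo_dir, "xml.py"), "w") as f:
    f.write("DATATYPE_CLASS_NAME = 'RepoXml'\n")

# Loading by explicit path under a distinct module name never touches
# sys.path, so the stdlib 'xml' package is not shadowed -- the same goal
# the imp.find_module/imp.load_module change below achieves.
spec = importlib.util.spec_from_file_location(
    "proprietary_xml", os.path.join(repo_dir, "xml.py"))
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

print(module.DATATYPE_CLASS_NAME)  # the repository's module is loaded
print(xml.__name__)                # stdlib xml remains untouched
```

The old `sys.path.insert(0, full_path); __import__(datatype_module)` approach would instead resolve `import xml` elsewhere in the process to the repository file.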
diff -r 9e35739c2da19583f24784d1b06e656dede2643a -r df1452d451ece4e33b720c481077ad792bb0b71a lib/galaxy/datatypes/registry.py
--- a/lib/galaxy/datatypes/registry.py
+++ b/lib/galaxy/datatypes/registry.py
@@ -1,7 +1,7 @@
"""
Provides mapping between extensions and datatypes, mime-types, etc.
"""
-import os, sys, tempfile, threading, logging
+import os, sys, tempfile, threading, logging, imp
import data, tabular, interval, images, sequence, qualityscore, genetics, xml, coverage, tracks, chrominfo, binary, assembly, ngsindex
import galaxy.util
from galaxy.util.odict import odict
@@ -55,10 +55,9 @@
being installed. Since installation is occurring after the datatypes registry has been initialized, its
contents cannot be overridden by new introduced conflicting data types.
"""
- def __import_module( full_path, datatype_module ):
- sys.path.insert( 0, full_path )
- imported_module = __import__( datatype_module )
- sys.path.pop( 0 )
+ def __import_module( full_path, datatype_module, datatype_class_name ):
+ open_file_obj, file_name, description = imp.find_module( datatype_module, [ full_path ] )
+ imported_module = imp.load_module( datatype_class_name, open_file_obj, file_name, description )
return imported_module
if root_dir and config:
handling_proprietary_datatypes = False
@@ -130,12 +129,12 @@
datatype_module = fields[0]
datatype_class_name = fields[1]
datatype_class = None
- if proprietary_path and proprietary_datatype_module:
+ if proprietary_path and proprietary_datatype_module and datatype_class_name:
# We need to change the value of sys.path, so do it in a way that is thread-safe.
lock = threading.Lock()
lock.acquire( True )
try:
- imported_module = __import_module( proprietary_path, proprietary_datatype_module )
+ imported_module = __import_module( proprietary_path, proprietary_datatype_module, datatype_class_name )
if imported_module not in self.imported_modules:
self.imported_modules.append( imported_module )
if hasattr( imported_module, datatype_class_name ):
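The `imp`-based approach in the diff above is Python 2 idiom (`imp` has since been removed from the standard library). A minimal sketch of the same clash-free import on modern Python using `importlib` — the file name `xml.py`, the `import_by_path` helper, and the `MyDatatype` class are illustrative, not part of the commit:

```python
import importlib.util
import os
import sys
import tempfile

def import_by_path(module_file_path, registered_name):
    # Load a module from an explicit file path and register it in
    # sys.modules under registered_name, so a datatype file called xml.py
    # never shadows (or is shadowed by) the stdlib xml package.
    spec = importlib.util.spec_from_file_location(registered_name, module_file_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    sys.modules[registered_name] = module
    return module

# Demo: a proprietary datatype file whose name clashes with the stdlib.
tmp_dir = tempfile.mkdtemp()
with open(os.path.join(tmp_dir, "xml.py"), "w") as fh:
    fh.write("class MyDatatype(object):\n    label = 'proprietary'\n")

mod = import_by_path(os.path.join(tmp_dir, "xml.py"), "proprietary_xml")

import xml.etree.ElementTree  # the real stdlib xml package still imports fine
```

Because the module is located by an explicit path rather than by mutating `sys.path`, no lock around the import is needed.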
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Fix for loading a tool from a tool config located in a temporary working directory.
by Bitbucket 30 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/9e35739c2da1/
changeset: 9e35739c2da1
user: greg
date: 2012-08-30 16:54:12
summary: Fix for loading a tool from a tool config located in a temporary working directory.
affected #: 2 files
diff -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a -r 9e35739c2da19583f24784d1b06e656dede2643a lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -722,7 +722,7 @@
error, correction_msg = handle_sample_tool_data_table_conf_file( trans.app, tool_data_table_config )
manifest_ctx, ctx_file = get_ctx_file_path_from_manifest( tool_config_filename, repo, changeset_revision )
if manifest_ctx and ctx_file:
- tool, message = load_tool_from_tmp_config( trans, manifest_ctx, ctx_file, work_dir )
+ tool, message = load_tool_from_tmp_config( trans, repo, manifest_ctx, ctx_file, work_dir )
try:
shutil.rmtree( work_dir )
except:
@@ -732,7 +732,7 @@
# Reset the tool_data_tables by loading the empty tool_data_table_conf.xml file.
reset_tool_data_tables( trans.app )
return repository, tool, message
-def load_tool_from_tmp_config( trans, ctx, ctx_file, work_dir ):
+def load_tool_from_tmp_config( trans, repo, ctx, ctx_file, work_dir ):
tool = None
message = ''
tmp_tool_config = get_named_tmpfile_from_ctx( ctx, ctx_file, work_dir )
diff -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a -r 9e35739c2da19583f24784d1b06e656dede2643a lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -2243,7 +2243,7 @@
work_dir = tempfile.mkdtemp()
manifest_ctx, ctx_file = get_ctx_file_path_from_manifest( relative_path_to_tool_config, repo, changeset_revision )
if manifest_ctx and ctx_file:
- tool, message = load_tool_from_tmp_config( trans, manifest_ctx, ctx_file, work_dir )
+ tool, message = load_tool_from_tmp_config( trans, repo, manifest_ctx, ctx_file, work_dir )
break
if guid:
tool_lineage = self.get_versions_of_tool( trans, repository, repository_metadata, guid )
commit/galaxy-central: greg: Enhancements, fixes and code cleanup related to setting metadata and loading and displaying tools in the tool shed.
by Bitbucket 30 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/9ff18845d24b/
changeset: 9ff18845d24b
user: greg
date: 2012-08-30 16:42:55
summary: Enhancements, fixes and code cleanup related to setting metadata and loading and displaying tools in the tool shed.
affected #: 4 files
diff -r 32030702cf48527a17a648e828ec6658bf96841c -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -334,7 +334,7 @@
noupdate=False,
rev=util.listify( str( ctx_rev ) ) )
def copy_sample_file( app, filename, dest_path=None ):
- """Copy xxx.loc.sample to dest_path/xxx.loc.sample and dest_path/xxx.loc. The default value for dest_path is ~/tool-data."""
+ """Copy xxx.sample to dest_path/xxx.sample and dest_path/xxx. The default value for dest_path is ~/tool-data."""
if dest_path is None:
dest_path = os.path.abspath( app.config.tool_data_path )
sample_file_name = strip_path( filename )
@@ -573,13 +573,13 @@
if datatypes_config:
metadata_dict = generate_datatypes_metadata( datatypes_config, metadata_dict )
# Get the relative path to all sample files included in the repository for storage in the repository's metadata.
- sample_files = get_sample_files_from_disk( repository_files_dir=files_dir,
- relative_install_dir=relative_install_dir,
- resetting_all_metadata_on_repository=resetting_all_metadata_on_repository )
- if sample_files:
- metadata_dict[ 'sample_files' ] = sample_files
+ sample_file_metadata_paths, sample_file_copy_paths = get_sample_files_from_disk( repository_files_dir=files_dir,
+ relative_install_dir=relative_install_dir,
+ resetting_all_metadata_on_repository=resetting_all_metadata_on_repository )
+ if sample_file_metadata_paths:
+ metadata_dict[ 'sample_files' ] = sample_file_metadata_paths
# Copy all sample files included in the repository to a single directory location so we can load tools that depend on them.
- for sample_file in sample_files:
+ for sample_file in sample_file_copy_paths:
copy_sample_file( app, sample_file, dest_path=work_dir )
# If the list of sample files includes a tool_data_table_conf.xml.sample file, load its table elements into memory.
relative_path, filename = os.path.split( sample_file )
@@ -608,21 +608,9 @@
print "Error parsing %s", full_path, ", exception: ", str( e )
is_tool = False
if is_tool:
- try:
- tool = app.toolbox.load_tool( full_path )
- except KeyError, e:
- tool = None
- invalid_tool_configs.append( name )
- error_message = 'This file requires an entry for "%s" in the tool_data_table_conf.xml file. Upload a file ' % str( e )
- error_message += 'named tool_data_table_conf.xml.sample to the repository that includes the required entry to correct '
- error_message += 'this error. '
- invalid_file_tups.append( ( name, error_message ) )
- except Exception, e:
- tool = None
- invalid_tool_configs.append( name )
- invalid_file_tups.append( ( name, str( e ) ) )
+ tool, valid, error_message = load_tool_from_config( app, full_path )
if tool is not None:
- invalid_files_and_errors_tups = check_tool_input_params( app, files_dir, name, tool, sample_files, webapp=webapp )
+ invalid_files_and_errors_tups = check_tool_input_params( app, files_dir, name, tool, sample_file_metadata_paths, webapp=webapp )
can_set_metadata = True
for tup in invalid_files_and_errors_tups:
if name in tup:
@@ -1077,7 +1065,8 @@
if resetting_all_metadata_on_repository:
# Keep track of the location where the repository is temporarily cloned so that we can strip it when setting metadata.
work_dir = repository_files_dir
- sample_files = []
+ sample_file_metadata_paths = []
+ sample_file_copy_paths = []
for root, dirs, files in os.walk( repository_files_dir ):
if root.find( '.hg' ) < 0:
for name in files:
@@ -1088,10 +1077,15 @@
if stripped_path_to_sample_file.startswith( '/' ):
stripped_path_to_sample_file = stripped_path_to_sample_file[ 1: ]
relative_path_to_sample_file = os.path.join( relative_install_dir, stripped_path_to_sample_file )
+ if os.path.exists( relative_path_to_sample_file ):
+ sample_file_copy_paths.append( relative_path_to_sample_file )
+ else:
+ sample_file_copy_paths.append( full_path_to_sample_file )
else:
relative_path_to_sample_file = os.path.join( root, name )
- sample_files.append( relative_path_to_sample_file )
- return sample_files
+ sample_file_copy_paths.append( relative_path_to_sample_file )
+ sample_file_metadata_paths.append( relative_path_to_sample_file )
+ return sample_file_metadata_paths, sample_file_copy_paths
def get_shed_tool_conf_dict( app, shed_tool_conf ):
"""
Return the in-memory version of the shed_tool_conf file, which is stored in the config_elems entry
@@ -1404,6 +1398,22 @@
def load_installed_display_applications( app, installed_repository_dict, deactivate=False ):
# Load or deactivate proprietary datatype display applications
app.datatypes_registry.load_display_applications( installed_repository_dict=installed_repository_dict, deactivate=deactivate )
+def load_tool_from_config( app, full_path ):
+ try:
+ tool = app.toolbox.load_tool( full_path )
+ valid = True
+ error_message = None
+ except KeyError, e:
+ tool = None
+ valid = False
+ error_message = 'This file requires an entry for "%s" in the tool_data_table_conf.xml file. Upload a file ' % str( e )
+ error_message += 'named tool_data_table_conf.xml.sample to the repository that includes the required entry to correct '
+ error_message += 'this error. '
+ except Exception, e:
+ tool = None
+ valid = False
+ error_message = str( e )
+ return tool, valid, error_message
def open_repository_files_folder( trans, folder_path ):
try:
files_list = get_repository_files( trans, folder_path )
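The new two-list contract of `get_sample_files_from_disk` can be sketched in isolation. This is a simplified, self-contained approximation: in the real function the metadata paths and copy paths can differ when all metadata is being reset against a temporary clone, whereas here they coincide:

```python
import os

def get_sample_files_from_disk(repository_files_dir):
    # Metadata paths are what gets stored in the repository's metadata
    # record; copy paths point at the on-disk files that are copied into
    # the work_dir so that tools depending on them can be loaded.
    sample_file_metadata_paths = []
    sample_file_copy_paths = []
    for root, dirs, files in os.walk(repository_files_dir):
        if '.hg' in root:
            continue
        for name in files:
            if name.endswith('.sample'):
                path = os.path.join(root, name)
                sample_file_metadata_paths.append(path)
                sample_file_copy_paths.append(path)
    return sample_file_metadata_paths, sample_file_copy_paths
```

Returning the two lists separately lets callers store repository-relative paths in metadata while copying from wherever the files actually live on disk.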
diff -r 32030702cf48527a17a648e828ec6658bf96841c -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -1,13 +1,14 @@
-import os, string, socket, logging, simplejson, binascii, tempfile
+import os, string, socket, logging, simplejson, binascii, tempfile, filecmp
from time import strftime
from datetime import *
from galaxy.datatypes.checkers import *
from galaxy.tools import *
from galaxy.util.json import from_json_string, to_json_string
from galaxy.util.hash_util import *
-from galaxy.util.shed_util import clone_repository, generate_metadata_for_changeset_revision, get_changectx_for_changeset, get_config_from_disk
-from galaxy.util.shed_util import get_configured_ui, get_named_tmpfile_from_ctx, handle_sample_tool_data_table_conf_file, INITIAL_CHANGELOG_HASH
-from galaxy.util.shed_util import reset_tool_data_tables, reversed_upper_bounded_changelog, strip_path
+from galaxy.util.shed_util import check_tool_input_params, clone_repository, copy_sample_file, generate_metadata_for_changeset_revision
+from galaxy.util.shed_util import get_changectx_for_changeset, get_config_from_disk, get_configured_ui, get_named_tmpfile_from_ctx
+from galaxy.util.shed_util import handle_sample_tool_data_table_conf_file, INITIAL_CHANGELOG_HASH, load_tool_from_config, reset_tool_data_tables
+from galaxy.util.shed_util import reversed_upper_bounded_changelog, strip_path
from galaxy.web.base.controller import *
from galaxy.webapps.community import model
from galaxy.model.orm import *
@@ -105,6 +106,42 @@
trans.sa_session.flush()
return item_rating
+def add_tool_versions( trans, id, repository_metadata, changeset_revisions ):
+ # Build a dictionary of { 'tool id' : 'parent tool id' } pairs for each tool in repository_metadata.
+ metadata = repository_metadata.metadata
+ tool_versions_dict = {}
+ for tool_dict in metadata.get( 'tools', [] ):
+ # We have at least 2 changeset revisions to compare tool guids and tool ids.
+ parent_id = get_parent_id( trans,
+ id,
+ tool_dict[ 'id' ],
+ tool_dict[ 'version' ],
+ tool_dict[ 'guid' ],
+ changeset_revisions )
+ tool_versions_dict[ tool_dict[ 'guid' ] ] = parent_id
+ if tool_versions_dict:
+ repository_metadata.tool_versions = tool_versions_dict
+ trans.sa_session.add( repository_metadata )
+ trans.sa_session.flush()
+def can_use_tool_config_disk_file( trans, repository, repo, file_path, changeset_revision ):
+ """
+ Determine if repository's tool config file on disk can be used. This method is restricted to tool config files since, with the
+ exception of tool config files, multiple files with the same name will likely be in various directories in the repository and we're
+ comparing file names only (not relative paths).
+ """
+ if not file_path or not os.path.exists( file_path ):
+ # The file no longer exists on disk, so it must have been deleted at some previous point in the change log.
+ return False
+ if changeset_revision == repository.tip:
+ return True
+ file_name = strip_path( file_path )
+ latest_version_of_file = get_latest_tool_config_revision_from_repository_manifest( repo, file_name, changeset_revision )
+ can_use_disk_file = filecmp.cmp( file_path, latest_version_of_file )
+ try:
+ os.unlink( latest_version_of_file )
+ except:
+ pass
+ return can_use_disk_file
def changeset_is_malicious( trans, id, changeset_revision, **kwd ):
"""Check the malicious flag in repository metadata for a specified change set"""
repository_metadata = get_repository_metadata_by_changeset_revision( trans, id, changeset_revision )
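`can_use_tool_config_disk_file` decides between the two load paths with a byte comparison via `filecmp.cmp`. A minimal sketch of that check follows; the file names are illustrative, and note that `filecmp.cmp` defaults to a shallow `os.stat` comparison, so `shallow=False` is passed here to force comparing contents:

```python
import filecmp
import os
import tempfile

def same_tool_config(disk_path, manifest_copy_path):
    # True only when the on-disk tool config is byte-identical to the
    # latest copy extracted from the repository manifest.
    return filecmp.cmp(disk_path, manifest_copy_path, shallow=False)

# Demo: a disk copy and a manifest copy of the same tool config.
work_dir = tempfile.mkdtemp()
on_disk = os.path.join(work_dir, 'filtering.xml')
from_manifest = os.path.join(work_dir, 'filtering.manifest.xml')
with open(on_disk, 'w') as fh:
    fh.write('<tool id="filter1" version="1.0.1"/>')
with open(from_manifest, 'w') as fh:
    fh.write('<tool id="filter1" version="1.0.1"/>')
```

If the comparison fails, the caller falls back to extracting the wanted revision from the manifest into a temporary work directory.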
@@ -221,6 +258,16 @@
else:
return 'subset'
return 'not equal and not subset'
+def copy_disk_sample_files_to_dir( trans, repo_files_dir, dest_path ):
+ sample_files = []
+ for root, dirs, files in os.walk( repo_files_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name.endswith( '.sample' ):
+ relative_path = os.path.join( root, name )
+ copy_sample_file( trans.app, relative_path, dest_path=dest_path )
+ sample_files.append( name )
+ return sample_files
def copy_file_from_disk( filename, repo_dir, dir ):
file_path = None
found = False
@@ -240,9 +287,7 @@
tmp_filename = None
return tmp_filename
def copy_file_from_manifest( repo, ctx, filename, dir ):
- """
- Copy the latest version of the file named filename from the repository manifest to the directory to which dir refers.
- """
+ """Copy the latest version of the file named filename from the repository manifest to the directory to which dir refers."""
for changeset in reversed_upper_bounded_changelog( repo, ctx ):
changeset_ctx = repo.changectx( changeset )
fctx = get_file_context_from_ctx( changeset_ctx, filename )
@@ -277,7 +322,7 @@
return '%s://%s%s/repos/%s/%s' % ( protocol, username, base, repository.user.username, repository.name )
else:
return '%s/repos/%s/%s' % ( base_url, repository.user.username, repository.name )
-def generate_message_for_invalid_tools( invalid_file_tups, repository, metadata_dict, as_html=True ):
+def generate_message_for_invalid_tools( invalid_file_tups, repository, metadata_dict, as_html=True, displaying_invalid_tool=False ):
if as_html:
new_line = '<br/>'
bold_start = '<b>'
@@ -287,12 +332,13 @@
bold_start = ''
bold_end = ''
message = ''
- if metadata_dict:
- message += "Metadata was defined for some items in revision '%s'. " % str( repository.tip )
- message += "Correct the following problems if necessary and reset metadata.%s" % new_line
- else:
- message += "Metadata cannot be defined for revision '%s' so this revision cannot be automatically " % str( repository.tip )
- message += "installed into a local Galaxy instance. Correct the following problems and reset metadata.%s" % new_line
+ if not displaying_invalid_tool:
+ if metadata_dict:
+ message += "Metadata was defined for some items in revision '%s'. " % str( repository.tip )
+ message += "Correct the following problems if necessary and reset metadata.%s" % new_line
+ else:
+ message += "Metadata cannot be defined for revision '%s' so this revision cannot be automatically " % str( repository.tip )
+ message += "installed into a local Galaxy instance. Correct the following problems and reset metadata.%s" % new_line
for itc_tup in invalid_file_tups:
tool_file, exception_msg = itc_tup
if exception_msg.find( 'No such file or directory' ) >= 0:
@@ -322,6 +368,19 @@
repository.name,
tool.id,
tool.version )
+def get_absolute_path_to_file_in_repository( repo_files_dir, file_name ):
+ file_path = None
+ found = False
+ for root, dirs, files in os.walk( repo_files_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name == file_name:
+ file_path = os.path.abspath( os.path.join( root, name ) )
+ found = True
+ break
+ if found:
+ break
+ return file_path
def get_category( trans, id ):
"""Get a category from the database"""
return trans.sa_session.query( trans.model.Category ).get( trans.security.decode_id( id ) )
@@ -353,12 +412,44 @@
if deleted:
return 'DELETED'
return None
+def get_ctx_file_path_from_manifest( filename, repo, changeset_revision ):
+ """Get the ctx file path for the latest revision of filename from the repository manifest up to the value of changeset_revision."""
+ stripped_filename = strip_path( filename )
+ for changeset in reversed_upper_bounded_changelog( repo, changeset_revision ):
+ manifest_changeset_revision = str( repo.changectx( changeset ) )
+ manifest_ctx = repo.changectx( changeset )
+ for ctx_file in manifest_ctx.files():
+ ctx_file_name = strip_path( ctx_file )
+ if ctx_file_name == stripped_filename:
+ return manifest_ctx, ctx_file
+ return None, None
def get_latest_repository_metadata( trans, decoded_repository_id ):
"""Get last metadata defined for a specified repository from the database"""
return trans.sa_session.query( trans.model.RepositoryMetadata ) \
.filter( trans.model.RepositoryMetadata.table.c.repository_id == decoded_repository_id ) \
.order_by( trans.model.RepositoryMetadata.table.c.id.desc() ) \
.first()
+def get_latest_tool_config_revision_from_repository_manifest( repo, filename, changeset_revision ):
+ """
+ Get the latest revision of a tool config file named filename from the repository manifest up to the value of changeset_revision.
+ This method is restricted to tool_config files rather than any file since it is likely that, with the exception of tool config files,
+ multiple files will have the same name in various directories within the repository.
+ """
+ stripped_filename = strip_path( filename )
+ for changeset in reversed_upper_bounded_changelog( repo, changeset_revision ):
+ manifest_ctx = repo.changectx( changeset )
+ for ctx_file in manifest_ctx.files():
+ ctx_file_name = strip_path( ctx_file )
+ if ctx_file_name == stripped_filename:
+ fctx = manifest_ctx[ ctx_file ]
+ fh = tempfile.NamedTemporaryFile( 'wb' )
+ tmp_filename = fh.name
+ fh.close()
+ fh = open( tmp_filename, 'wb' )
+ fh.write( fctx.data() )
+ fh.close()
+ return tmp_filename
+ return None
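`get_latest_tool_config_revision_from_repository_manifest` materializes the manifest bytes through a close-then-reopen `NamedTemporaryFile` dance. A sketch of an equivalent, slightly safer pattern using `tempfile.mkstemp` (the helper name is ours, not the commit's):

```python
import os
import tempfile

def write_bytes_to_tmpfile(data, dir=None):
    # mkstemp hands back an already-open descriptor, so there is no window
    # between choosing the name and writing the data. Closing a default
    # NamedTemporaryFile deletes it, so reopening by .name afterwards
    # relies on racing to recreate the path.
    fd, tmp_filename = tempfile.mkstemp(dir=dir)
    with os.fdopen(fd, 'wb') as fh:
        fh.write(data)
    return tmp_filename
```

The caller remains responsible for unlinking the returned file, just as with the temporary files in the diff.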
def get_list_of_copied_sample_files( repo, ctx, dir ):
"""
Find all sample files (files in the repository with the special .sample extension) in the reversed repository manifest up to ctx. Copy
@@ -451,8 +542,8 @@
def get_repository_metadata_by_changeset_revision( trans, id, changeset_revision ):
"""Get metadata for a specified repository change set from the database"""
# Make sure there are no duplicate records, and return the single unique record for the changeset_revision. Duplicate records were somehow
- # creatd in the past. This may or may not be resolved, so when it is confirmed that the cause of duplicate records has been corrected, tweak
- # this method accordingly.
+ # created in the past. The cause of this issue has been resolved, but we'll leave this method as is for a while longer to ensure all duplicate
+ # records are removed.
all_metadata_records = trans.sa_session.query( trans.model.RepositoryMetadata ) \
.filter( and_( trans.model.RepositoryMetadata.table.c.repository_id == trans.security.decode_id( id ),
trans.model.RepositoryMetadata.table.c.changeset_revision == changeset_revision ) ) \
@@ -577,90 +668,61 @@
log.exception( "An error occurred sending a tool shed repository update alert by email." )
def is_downloadable( metadata_dict ):
return 'datatypes' in metadata_dict or 'tools' in metadata_dict or 'workflows' in metadata_dict
-def load_tool( trans, config_file ):
- """Load a single tool from the file named by `config_file` and return an instance of `Tool`."""
- # Parse XML configuration file and get the root element
- tree = util.parse_xml( config_file )
- root = tree.getroot()
- if root.tag == 'tool':
- # Allow specifying a different tool subclass to instantiate
- if root.find( "type" ) is not None:
- type_elem = root.find( "type" )
- module = type_elem.get( 'module', 'galaxy.tools' )
- cls = type_elem.get( 'class' )
- mod = __import__( module, globals(), locals(), [cls] )
- ToolClass = getattr( mod, cls )
- elif root.get( 'tool_type', None ) is not None:
- ToolClass = tool_types.get( root.get( 'tool_type' ) )
- else:
- ToolClass = Tool
- return ToolClass( config_file, root, trans.app )
- return None
def load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config_filename ):
"""
Return a loaded tool whose tool config file name (e.g., filtering.xml) is the value of tool_config_filename. The value of changeset_revision
is a valid (downloadable) changeset revision. The tool config will be located in the repository manifest between the received valid changeset
revision and the first changeset revision in the repository, searching backwards.
"""
- def load_from_tmp_config( toolbox, ctx, ctx_file, work_dir ):
- tool = None
- message = ''
- tmp_tool_config = get_named_tmpfile_from_ctx( ctx, ctx_file, work_dir )
- if tmp_tool_config:
- element_tree = util.parse_xml( tmp_tool_config )
- element_tree_root = element_tree.getroot()
- # Look for code files required by the tool config.
- tmp_code_files = []
- for code_elem in element_tree_root.findall( 'code' ):
- code_file_name = code_elem.get( 'file' )
- tmp_code_file_name = copy_file_from_manifest( repo, ctx, code_file_name, work_dir )
- if tmp_code_file_name:
- tmp_code_files.append( tmp_code_file_name )
- try:
- tool = toolbox.load_tool( tmp_tool_config )
- except KeyError, e:
- message = '<b>%s</b> - This file requires an entry for %s in the tool_data_table_conf.xml file. ' % ( tool_config_filename, str( e ) )
- message += 'Upload a file named tool_data_table_conf.xml.sample to the repository that includes the required entry to correct this error. '
- except Exception, e:
- message = 'Error loading tool: %s. ' % str( e )
- for tmp_code_file in tmp_code_files:
- try:
- os.unlink( tmp_code_file )
- except:
- pass
- try:
- os.unlink( tmp_tool_config )
- except:
- pass
- return tool, message
original_tool_data_path = trans.app.config.tool_data_path
- tool_config_filename = strip_path( tool_config_filename )
repository = get_repository( trans, repository_id )
repo_files_dir = repository.repo_path
repo = hg.repository( get_configured_ui(), repo_files_dir )
- ctx = get_changectx_for_changeset( repo, changeset_revision )
message = ''
+ tool = None
+ can_use_disk_file = False
+ tool_config_filepath = get_absolute_path_to_file_in_repository( repo_files_dir, tool_config_filename )
work_dir = tempfile.mkdtemp()
- sample_files, deleted_sample_files = get_list_of_copied_sample_files( repo, ctx, dir=work_dir )
- if sample_files:
- trans.app.config.tool_data_path = work_dir
- if 'tool_data_table_conf.xml.sample' in sample_files:
- # Load entries into the tool_data_tables if the tool requires them.
- tool_data_table_config = os.path.join( work_dir, 'tool_data_table_conf.xml' )
- error, correction_msg = handle_sample_tool_data_table_conf_file( trans.app, tool_data_table_config )
- found = False
- # Get the latest revision of the tool config from the repository manifest up to the value of changeset_revision.
- for changeset in reversed_upper_bounded_changelog( repo, changeset_revision ):
- manifest_changeset_revision = str( repo.changectx( changeset ) )
- manifest_ctx = repo.changectx( changeset )
- for ctx_file in manifest_ctx.files():
- ctx_file_name = strip_path( ctx_file )
- if ctx_file_name == tool_config_filename:
- found = True
- break
- if found:
- tool, message = load_from_tmp_config( trans.app.toolbox, manifest_ctx, ctx_file, work_dir )
- break
+ can_use_disk_file = can_use_tool_config_disk_file( trans, repository, repo, tool_config_filepath, changeset_revision )
+ if can_use_disk_file:
+ # Copy all sample files from disk to a temporary directory since the sample files may be in multiple directories.
+ sample_files = copy_disk_sample_files_to_dir( trans, repo_files_dir, work_dir )
+ if sample_files:
+ trans.app.config.tool_data_path = work_dir
+ if 'tool_data_table_conf.xml.sample' in sample_files:
+ # Load entries into the tool_data_tables if the tool requires them.
+ tool_data_table_config = os.path.join( work_dir, 'tool_data_table_conf.xml' )
+ error, correction_msg = handle_sample_tool_data_table_conf_file( trans.app, tool_data_table_config )
+ tool, valid, message = load_tool_from_config( trans.app, tool_config_filepath )
+ if tool is not None:
+ invalid_files_and_errors_tups = check_tool_input_params( trans.app,
+ repo_files_dir,
+ tool_config_filename,
+ tool,
+ sample_files,
+ webapp='community' )
+ if invalid_files_and_errors_tups:
+ message = generate_message_for_invalid_tools( invalid_files_and_errors_tups,
+ repository,
+ metadata_dict=None,
+ as_html=True,
+ displaying_invalid_tool=True )
+ status = 'error'
+ else:
+ # The desired version of the tool config is no longer on disk, so create a temporary work environment and copy the tool config and dependent files.
+ ctx = get_changectx_for_changeset( repo, changeset_revision )
+ # We're not currently doing anything with the returned list of deleted_sample_files here. It is intended to help handle sample files that are in
+ # the manifest, but have been deleted from disk.
+ sample_files, deleted_sample_files = get_list_of_copied_sample_files( repo, ctx, dir=work_dir )
+ if sample_files:
+ trans.app.config.tool_data_path = work_dir
+ if 'tool_data_table_conf.xml.sample' in sample_files:
+ # Load entries into the tool_data_tables if the tool requires them.
+ tool_data_table_config = os.path.join( work_dir, 'tool_data_table_conf.xml' )
+ error, correction_msg = handle_sample_tool_data_table_conf_file( trans.app, tool_data_table_config )
+ manifest_ctx, ctx_file = get_ctx_file_path_from_manifest( tool_config_filename, repo, changeset_revision )
+ if manifest_ctx and ctx_file:
+ tool, message = load_tool_from_tmp_config( trans, manifest_ctx, ctx_file, work_dir )
try:
shutil.rmtree( work_dir )
except:
@@ -669,50 +731,32 @@
trans.app.config.tool_data_path = original_tool_data_path
# Reset the tool_data_tables by loading the empty tool_data_table_conf.xml file.
reset_tool_data_tables( trans.app )
+ return repository, tool, message
+def load_tool_from_tmp_config( trans, ctx, ctx_file, work_dir ):
+ tool = None
+ message = ''
+ tmp_tool_config = get_named_tmpfile_from_ctx( ctx, ctx_file, work_dir )
+ if tmp_tool_config:
+ element_tree = util.parse_xml( tmp_tool_config )
+ element_tree_root = element_tree.getroot()
+ # Look for code files required by the tool config.
+ tmp_code_files = []
+ for code_elem in element_tree_root.findall( 'code' ):
+ code_file_name = code_elem.get( 'file' )
+ tmp_code_file_name = copy_file_from_manifest( repo, ctx, code_file_name, work_dir )
+ if tmp_code_file_name:
+ tmp_code_files.append( tmp_code_file_name )
+ tool, valid, message = load_tool_from_config( trans.app, tmp_tool_config )
+ for tmp_code_file in tmp_code_files:
+ try:
+ os.unlink( tmp_code_file )
+ except:
+ pass
+ try:
+ os.unlink( tmp_tool_config )
+ except:
+ pass
return tool, message
-def load_tool_from_tmp_directory( trans, repo, repo_dir, ctx, filename, dir ):
- is_tool_config = False
- tool = None
- valid = False
- error_message = ''
- tmp_config = get_named_tmpfile_from_ctx( ctx, filename, dir )
- if tmp_config:
- if not ( check_binary( tmp_config ) or check_image( tmp_config ) or check_gzip( tmp_config )[ 0 ]
- or check_bz2( tmp_config )[ 0 ] or check_zip( tmp_config ) ):
- try:
- # Make sure we're looking at a tool config and not a display application config or something else.
- element_tree = util.parse_xml( tmp_config )
- element_tree_root = element_tree.getroot()
- is_tool_config = element_tree_root.tag == 'tool'
- except Exception, e:
- log.debug( "Error parsing %s, exception: %s" % ( tmp_config, str( e ) ) )
- is_tool_config = False
- if is_tool_config:
- # Load entries into the tool_data_tables if the tool requires them.
- tool_data_table_config = copy_file_from_manifest( repo, ctx, 'tool_data_table_conf.xml.sample', dir )
- if tool_data_table_config:
- error, correction_msg = handle_sample_tool_data_table_conf_file( trans.app, tool_data_table_config )
- # Look for code files required by the tool config. The directory to which dir refers should be removed by the caller.
- for code_elem in element_tree_root.findall( 'code' ):
- code_file_name = code_elem.get( 'file' )
- if not os.path.exists( os.path.join( dir, code_file_name ) ):
- tmp_code_file_name = copy_file_from_disk( code_file_name, repo_dir, dir )
- if tmp_code_file_name is None:
- tmp_code_file_name = copy_file_from_manifest( repo, ctx, code_file_name, dir )
- try:
- tool = load_tool( trans, tmp_config )
- valid = True
- except KeyError, e:
- valid = False
- error_message = 'This file requires an entry for "%s" in the tool_data_table_conf.xml file. Upload a file ' % str( e )
- error_message += 'named tool_data_table_conf.xml.sample to the repository that includes the required entry to correct '
- error_message += 'this error. '
- except Exception, e:
- valid = False
- error_message = str( e )
- # Reset the tool_data_tables by loading the empty tool_data_table_conf.xml file.
- reset_tool_data_tables( trans.app )
- return is_tool_config, valid, tool, error_message
def new_tool_metadata_required( trans, repository, metadata_dict ):
"""
Compare the last saved metadata for each tool in the repository with the new metadata in metadata_dict to determine if a new repository_metadata
@@ -895,23 +939,6 @@
Set metadata using the repository's current disk files, returning specific error messages (if any) to alert the repository owner that the changeset
has problems.
"""
- def add_tool_versions( trans, id, repository_metadata, changeset_revisions ):
- # Build a dictionary of { 'tool id' : 'parent tool id' } pairs for each tool in repository_metadata.
- metadata = repository_metadata.metadata
- tool_versions_dict = {}
- for tool_dict in metadata.get( 'tools', [] ):
- # We have at least 2 changeset revisions to compare tool guids and tool ids.
- parent_id = get_parent_id( trans,
- id,
- tool_dict[ 'id' ],
- tool_dict[ 'version' ],
- tool_dict[ 'guid' ],
- changeset_revisions )
- tool_versions_dict[ tool_dict[ 'guid' ] ] = parent_id
- if tool_versions_dict:
- repository_metadata.tool_versions = tool_versions_dict
- trans.sa_session.add( repository_metadata )
- trans.sa_session.flush()
message = ''
status = 'done'
encoded_id = trans.security.encode_id( repository.id )
diff -r 32030702cf48527a17a648e828ec6658bf96841c -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -9,9 +9,9 @@
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy.util.json import from_json_string, to_json_string
from galaxy.model.orm import *
-from galaxy.util.shed_util import create_repo_info_dict, get_changectx_for_changeset, get_configured_ui, get_repository_file_contents, NOT_TOOL_CONFIGS
-from galaxy.util.shed_util import open_repository_files_folder, reversed_lower_upper_bounded_changelog, reversed_upper_bounded_changelog, strip_path
-from galaxy.util.shed_util import to_html_escaped, update_repository, url_join
+from galaxy.util.shed_util import create_repo_info_dict, get_changectx_for_changeset, get_configured_ui, get_repository_file_contents, load_tool_from_config
+from galaxy.util.shed_util import NOT_TOOL_CONFIGS, open_repository_files_folder, reversed_lower_upper_bounded_changelog, reversed_upper_bounded_changelog
+from galaxy.util.shed_util import strip_path, to_html_escaped, update_repository, url_join
from galaxy.tool_shed.encoding_util import *
from common import *
@@ -251,10 +251,13 @@
# TODO: improve performance by adding a db table associating users with repositories for which they have write access.
username = kwd[ 'username' ]
clause_list = []
- for repository in trans.sa_session.query( self.model_class ):
- allow_push_usernames = repository.allow_push.split( ',' )
- if username in allow_push_usernames:
- clause_list.append( self.model_class.table.c.id == repository.id )
+ for repository in trans.sa_session.query( self.model_class ) \
+ .filter( self.model_class.table.c.deleted == False ):
+ allow_push = repository.allow_push
+ if allow_push:
+ allow_push_usernames = allow_push.split( ',' )
+ if username in allow_push_usernames:
+ clause_list.append( self.model_class.table.c.id == repository.id )
if clause_list:
return trans.sa_session.query( self.model_class ) \
.filter( or_( *clause_list ) ) \
@@ -912,8 +915,7 @@
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
webapp = get_webapp( trans, **kwd )
- repository = get_repository( trans, repository_id )
- tool, message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
+ repository, tool, message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
tool_state = self.__new_state( trans )
is_malicious = changeset_is_malicious( trans, repository_id, repository.tip )
try:
@@ -1402,14 +1404,12 @@
return trans.response.send_redirect( url )
@web.expose
def load_invalid_tool( self, trans, repository_id, tool_config, changeset_revision, **kwd ):
- # FIXME: loading an invalid tool should display an appropriate message as to why the tool is invalid. This worked until recently.
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'error' )
webapp = get_webapp( trans, **kwd )
repository_clone_url = generate_clone_url( trans, repository_id )
- repository = get_repository( trans, repository_id )
- tool, error_message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
+ repository, tool, error_message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_config )
tool_state = self.__new_state( trans )
is_malicious = changeset_is_malicious( trans, repository_id, repository.tip )
try:
@@ -2220,6 +2220,8 @@
status = params.get( 'status', 'done' )
webapp = get_webapp( trans, **kwd )
repository = get_repository( trans, repository_id )
+ repo_files_dir = repository.repo_path
+ repo = hg.repository( get_configured_ui(), repo_files_dir )
tool_metadata_dict = {}
tool_lineage = []
tool = None
@@ -2230,12 +2232,18 @@
if 'tools' in metadata:
for tool_metadata_dict in metadata[ 'tools' ]:
if tool_metadata_dict[ 'id' ] == tool_id:
+ relative_path_to_tool_config = tool_metadata_dict[ 'tool_config' ]
guid = tool_metadata_dict[ 'guid' ]
- try:
- # We may be attempting to load a tool that no longer exists in the repository tip.
- tool = load_tool( trans, os.path.abspath( tool_metadata_dict[ 'tool_config' ] ) )
- except:
- tool = None
+ full_path = os.path.abspath( relative_path_to_tool_config )
+ can_use_disk_file = can_use_tool_config_disk_file( trans, repository, repo, full_path, changeset_revision )
+ if can_use_disk_file:
+ tool, valid, message = load_tool_from_config( trans.app, full_path )
+ else:
+ # We're attempting to load a tool using a config that no longer exists on disk.
+ work_dir = tempfile.mkdtemp()
+ manifest_ctx, ctx_file = get_ctx_file_path_from_manifest( relative_path_to_tool_config, repo, changeset_revision )
+ if manifest_ctx and ctx_file:
+ tool, message = load_tool_from_tmp_config( trans, manifest_ctx, ctx_file, work_dir )
break
if guid:
tool_lineage = self.get_versions_of_tool( trans, repository, repository_metadata, guid )
diff -r 32030702cf48527a17a648e828ec6658bf96841c -r 9ff18845d24b092a9297d1ec4a803b7e0e5fae4a lib/galaxy/webapps/community/controllers/workflow.py
--- a/lib/galaxy/webapps/community/controllers/workflow.py
+++ b/lib/galaxy/webapps/community/controllers/workflow.py
@@ -48,7 +48,7 @@
self.errors = None
for tool_dict in tools_metadata:
if self.tool_id in [ tool_dict[ 'id' ], tool_dict[ 'guid' ] ]:
- self.tool, message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_dict[ 'tool_config' ] )
+ repository, self.tool, message = load_tool_from_changeset_revision( trans, repository_id, changeset_revision, tool_dict[ 'tool_config' ] )
if message and self.tool is None:
self.errors = 'unavailable'
break
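The hunks above replace a bare try/except around `load_tool` with an explicit check of whether the tool config file still exists on disk for the requested changeset, falling back to extracting the historical copy from the repository manifest into a temporary directory. A minimal sketch of that load-from-disk-else-manifest pattern (the helper names here are illustrative, not the actual Galaxy API):

```python
import os
import tempfile

def load_tool_config(path_on_disk, fetch_from_manifest):
    """Prefer the on-disk copy of a tool config; fall back to extracting
    the historical copy from revision history into a scratch directory."""
    if os.path.exists(path_on_disk):
        # The config at the repository tip is usable for this revision.
        with open(path_on_disk) as f:
            return f.read(), 'disk'
    # The config was renamed or removed after this changeset, so pull the
    # old contents out of the manifest into a temporary working directory.
    work_dir = tempfile.mkdtemp()
    contents = fetch_from_manifest()
    tmp_path = os.path.join(work_dir, os.path.basename(path_on_disk))
    with open(tmp_path, 'w') as f:
        f.write(contents)
    return contents, 'manifest'
```

The design point is that the disk path is only trusted after an explicit check (`can_use_tool_config_disk_file` in the diff), rather than letting a failed load silently yield `tool = None`.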
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: rmarenco: Missing verification of the condition before checking the conditional parameters
by Bitbucket 29 Aug '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/32030702cf48/
changeset: 32030702cf48
user: rmarenco
date: 2012-08-29 23:27:02
summary: Missing verification of the condition before checking the conditional parameters
affected #: 1 file
diff -r ff74a24aa175a28975abdc623d34c2b805725a4f -r 32030702cf48527a17a648e828ec6658bf96841c tools/ngs_rna/express_wrapper.xml
--- a/tools/ngs_rna/express_wrapper.xml
+++ b/tools/ngs_rna/express_wrapper.xml
@@ -49,7 +49,7 @@
       <data format="txt" name="params" from_work_dir="params.xprs"/>
       <data format="txt" name="results" from_work_dir="results.xprs"/>
       <data format="txt" name="varcov" from_work_dir="varcov.xprs">
-        <filter>additional_params[ 'calc_covar' ] == "yes"</filter>
+        <filter>additional_params[ 'use_additional' ] == "yes" and additional_params[ 'calc_covar' ] == "yes"</filter>
       </data>
     </outputs>
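The corrected filter guards the `calc_covar` lookup behind `use_additional`: the inner key only exists when the additional-parameters conditional is expanded, so indexing it unguarded can fail. In plain Python, the short-circuit ordering looks like this (illustrative dicts, not Galaxy's actual parameter objects):

```python
def varcov_output_wanted(additional_params):
    # Check the conditional's selector first: 'calc_covar' is only present
    # when 'use_additional' == "yes", so the original unguarded bracket
    # lookup could raise a KeyError. Short-circuit 'and' plus .get() makes
    # the check safe for both collapsed and expanded parameter dicts.
    return (additional_params.get('use_additional') == 'yes'
            and additional_params.get('calc_covar') == 'yes')
```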
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/ff74a24aa175/
changeset: ff74a24aa175
user: inithello
date: 2012-08-29 22:08:01
summary: Upgrade mercurial egg to 2.2.3
affected #: 1 file
diff -r 61a2c343081f2dc194caaacb3c379392353caa7a -r ff74a24aa175a28975abdc623d34c2b805725a4f eggs.ini
--- a/eggs.ini
+++ b/eggs.ini
@@ -17,7 +17,7 @@
ctypes = 1.0.2
DRMAA_python = 0.2
MarkupSafe = 0.12
-mercurial = 2.1.2
+mercurial = 2.2.3
MySQL_python = 1.2.3c1
numpy = 1.6.0
pbs_python = 4.1.0