galaxy-commits
May 2013: 1 participant, 218 discussions
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/975f94138a55/
Changeset: 975f94138a55
User: dan
Date: 2013-05-24 19:06:21
Summary: merge
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/b13e85effb67/
Changeset: b13e85effb67
User: dan
Date: 2013-05-24 19:04:32
Summary: Fix for MetadataInDataTableColumnValidator becoming stale when a tool data table is updated during Galaxy runtime.
Affected #: 3 files
diff -r 7e820c43d641d784d73526637e3d55ede5f8327e -r b13e85effb678a5f820630bee89a6a233299b5b8 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -161,10 +161,23 @@
self.tool_data_file = None
self.tool_data_path = tool_data_path
self.missing_index_file = None
+ # increment this variable any time a new entry is added, or when the table is totally reloaded
+ # This value has no external meaning, and does not represent an abstract version of the underlying data
+ self._loaded_content_version = 1
def get_empty_field_by_name( self, name ):
return self.empty_field_values.get( name, self.empty_field_value )
-
+
+ def _add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ raise NotImplementedError( "Abstract method" )
+
+ def add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ self._add_entry( entry, persist=persist, persist_on_error=persist_on_error, **kwd )
+ self._loaded_content_version += 1
+ return self._loaded_content_version
+
+ def is_current_version( self, other_version ):
+ return self._loaded_content_version == other_version
class TabularToolDataTable( ToolDataTable ):
"""
@@ -234,6 +247,9 @@
def get_fields( self ):
return self.data
+
+ def get_version_fields( self ):
+ return ( self._loaded_content_version, self.data )
def parse_column_spec( self, config_element ):
"""
@@ -324,6 +340,61 @@
rval = fields[ return_col ]
break
return rval
-
+
+ def _add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ #accepts dict or list of columns
+ if isinstance( entry, dict ):
+ fields = []
+ for column_name in self.get_column_name_list():
+ if column_name not in entry:
+ log.debug( "Using default column value for column '%s' when adding data table entry (%s) to table '%s'.", column_name, entry, self.name )
+ field_value = self.get_empty_field_by_name( column_name )
+ else:
+ field_value = entry[ column_name ]
+ fields.append( field_value )
+ else:
+ fields = entry
+ if self.largest_index < len( fields ):
+ fields = self._replace_field_separators( fields )
+ self.data.append( fields )
+ field_len_error = False
+ else:
+ log.error( "Attempted to add fields (%s) to data table '%s', but there were not enough fields specified ( %i < %i ).", fields, self.name, len( fields ), self.largest_index + 1 )
+ field_len_error = True
+ if persist and ( not field_len_error or persist_on_error ):
+ #FIXME: Need to lock these files for editing
+ try:
+ data_table_fh = open( self.filename, 'r+b' )
+ except IOError, e:
+ log.warning( 'Error opening data table file (%s) with r+b, assuming file does not exist and will open as wb: %s', self.filename, e )
+ data_table_fh = open( self.filename, 'wb' )
+ if os.stat( self.filename )[6] != 0:
+ # ensure last existing line ends with new line
+ data_table_fh.seek( -1, 2 ) #last char in file
+ last_char = data_table_fh.read( 1 )
+ if last_char not in [ '\n', '\r' ]:
+ data_table_fh.write( '\n' )
+ data_table_fh.write( "%s\n" % ( self.separator.join( fields ) ) )
+ return not field_len_error
+
+ def _replace_field_separators( self, fields, separator=None, replace=None, comment_char=None ):
+ #make sure none of the fields contain separator
+ #make sure separator replace is different from comment_char,
+ #due to possible leading replace
+ if separator is None:
+ separator = self.separator
+ if replace is None:
+ if separator == " ":
+ if comment_char == "\t":
+ replace = "_"
+ else:
+ replace = "\t"
+ else:
+ if comment_char == " ":
+ replace = "_"
+ else:
+ replace = " "
+ return map( lambda x: x.replace( separator, replace ), fields )
+
# Registry of tool data types by type_key
tool_data_table_types = dict( [ ( cls.type_key, cls ) for cls in [ TabularToolDataTable ] ] )
diff -r 7e820c43d641d784d73526637e3d55ede5f8327e -r b13e85effb678a5f820630bee89a6a233299b5b8 lib/galaxy/tools/data_manager/manager.py
--- a/lib/galaxy/tools/data_manager/manager.py
+++ b/lib/galaxy/tools/data_manager/manager.py
@@ -233,59 +233,21 @@
assert output_ref_dataset is not None, "Referenced output was not found."
output_ref_values[ data_table_column ] = output_ref_dataset
- final_data_table_values = []
if not isinstance( data_table_values, list ):
data_table_values = [ data_table_values ]
- columns = data_table.get_column_name_list()
- #FIXME: Need to lock these files for editing
- try:
- data_table_fh = open( data_table.filename, 'r+b' )
- except IOError, e:
- log.warning( 'Error opening data table file (%s) with r+b, assuming file does not exist and will open as wb: %s' % ( data_table.filename, e ) )
- data_table_fh = open( data_table.filename, 'wb' )
- if os.stat( data_table.filename )[6] != 0:
- # ensure last existing line ends with new line
- data_table_fh.seek( -1, 2 ) #last char in file
- last_char = data_table_fh.read()
- if last_char not in [ '\n', '\r' ]:
- data_table_fh.write( '\n' )
for data_table_row in data_table_values:
data_table_value = dict( **data_table_row ) #keep original values here
for name, value in data_table_row.iteritems(): #FIXME: need to loop through here based upon order listed in data_manager config
if name in output_ref_values:
moved = self.process_move( data_table_name, name, output_ref_values[ name ].extra_files_path, **data_table_value )
data_table_value[ name ] = self.process_value_translation( data_table_name, name, **data_table_value )
- final_data_table_values.append( data_table_value )
- fields = []
- for column_name in columns:
- if column_name is None or column_name not in data_table_value:
- fields.append( data_table.get_empty_field_by_name( column_name ) )
- else:
- fields.append( data_table_value[ column_name ] )
- #should we add a comment to file about automatically generated value here?
- data_table_fh.write( "%s\n" % ( data_table.separator.join( self._replace_field_separators( fields, separator=data_table.separator ) ) ) ) #write out fields to disk
- data_table.data.append( fields ) #add fields to loaded data table
- data_table_fh.close()
+ data_table.add_entry( data_table_value, persist=True )
+
for data_table_name, data_table_values in data_tables_dict.iteritems():
#tool returned extra data table entries, but data table was not declared in data manager
#do not add these values, but do provide messages
log.warning( 'The data manager "%s" returned an undeclared data table "%s" with new entries "%s". These entries will not be created. Please confirm that an entry for "%s" exists in your "%s" file.' % ( self.id, data_table_name, data_table_values, data_table_name, self.data_managers.filename ) )
- def _replace_field_separators( self, fields, separator="\t", replace=None, comment_char=None ):
- #make sure none of the fields contain separator
- #make sure separator replace is different from comment_char,
- #due to possible leading replace
- if replace is None:
- if separator == " ":
- if comment_char == "\t":
- replace = "_"
- else:
- replace = "\t"
- else:
- if comment_char == " ":
- replace = "_"
- else:
- replace = " "
- return map( lambda x: x.replace( separator, replace ), fields )
+
def process_move( self, data_table_name, column_name, source_base_path, relative_symlinks=False, **kwd ):
if data_table_name in self.move_by_data_table_column and column_name in self.move_by_data_table_column[ data_table_name ]:
move_dict = self.move_by_data_table_column[ data_table_name ][ column_name ]
diff -r 7e820c43d641d784d73526637e3d55ede5f8327e -r b13e85effb678a5f820630bee89a6a233299b5b8 lib/galaxy/tools/parameters/validation.py
--- a/lib/galaxy/tools/parameters/validation.py
+++ b/lib/galaxy/tools/parameters/validation.py
@@ -293,18 +293,31 @@
if line_startswith:
line_startswith = line_startswith.strip()
return cls( tool_data_table, metadata_name, metadata_column, message, line_startswith )
+
def __init__( self, tool_data_table, metadata_name, metadata_column, message="Value for metadata not found.", line_startswith=None ):
self.metadata_name = metadata_name
self.message = message
self.valid_values = []
+ self._data_table_content_version = None
+ self._tool_data_table = tool_data_table
if isinstance( metadata_column, basestring ):
metadata_column = tool_data_table.columns[ metadata_column ]
- for fields in tool_data_table.get_fields():
- if metadata_column < len( fields ):
- self.valid_values.append( fields[ metadata_column ] )
+ self._metadata_column = metadata_column
+ self._load_values()
+
+ def _load_values( self ):
+ self._data_table_content_version, data_fields = self._tool_data_table.get_version_fields()
+ self.valid_values = []
+ for fields in data_fields:
+ if self._metadata_column < len( fields ):
+ self.valid_values.append( fields[ self._metadata_column ] )
+
def validate( self, value, history = None ):
if not value: return
if hasattr( value, "metadata" ):
+ if not self._tool_data_table.is_current_version( self._data_table_content_version ):
+ log.debug( 'MetadataInDataTableColumnValidator values are out of sync with data table (%s), updating validator.', self._tool_data_table.name )
+ self._load_values()
if value.metadata.spec[self.metadata_name].param.to_string( value.metadata.get( self.metadata_name ) ) in self.valid_values:
return
raise ValueError( self.message )
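The pattern in this changeset (a monotonically increasing content-version counter that cached consumers compare against before trusting their data) can be sketched standalone. This is an illustrative reconstruction, not Galaxy's actual classes; the names `Table` and `ColumnValidator` are hypothetical stand-ins for `ToolDataTable` and `MetadataInDataTableColumnValidator`:

```python
class Table:
    """Holds rows plus a version counter that bumps on every mutation."""
    def __init__(self):
        self.rows = []
        # No external meaning; only equality comparisons matter,
        # mirroring _loaded_content_version in the diff above.
        self._version = 1

    def add_entry(self, row):
        self.rows.append(row)
        self._version += 1
        return self._version

    def get_version_fields(self):
        # Return version and data together so a consumer caches a
        # version that matches exactly the rows it read.
        return self._version, self.rows

    def is_current_version(self, other_version):
        return self._version == other_version


class ColumnValidator:
    """Caches one column's values; lazily reloads when the table changes."""
    def __init__(self, table, column):
        self._table = table
        self._column = column
        self._cached_version = None
        self.valid_values = []
        self._load_values()

    def _load_values(self):
        self._cached_version, rows = self._table.get_version_fields()
        self.valid_values = [
            row[self._column] for row in rows if self._column < len(row)
        ]

    def validate(self, value):
        # The stale-cache check this commit introduces: refresh only
        # when the table's version has moved past our cached one.
        if not self._table.is_current_version(self._cached_version):
            self._load_values()
        if value not in self.valid_values:
            raise ValueError("Value for metadata not found.")
```

Before this fix, the validator read the table's rows once at construction time; entries added at runtime (e.g. by a data manager) were rejected until restart. The version check makes the reload lazy and cheap: a single integer comparison per validation.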
https://bitbucket.org/galaxy/galaxy-central/commits/ec277b165b13/
Changeset: ec277b165b13
Branch: next-stable
User: dan
Date: 2013-05-24 19:04:32
Summary: Fix for MetadataInDataTableColumnValidator becoming stale when a tool data table is updated during Galaxy runtime.
Affected #: 3 files
diff -r 08b05ff1c7cfaad179d25970df0b740da282550e -r ec277b165b13131d02998c3cbab3b3e664141a58 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -161,10 +161,23 @@
self.tool_data_file = None
self.tool_data_path = tool_data_path
self.missing_index_file = None
+ # increment this variable any time a new entry is added, or when the table is totally reloaded
+ # This value has no external meaning, and does not represent an abstract version of the underlying data
+ self._loaded_content_version = 1
def get_empty_field_by_name( self, name ):
return self.empty_field_values.get( name, self.empty_field_value )
-
+
+ def _add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ raise NotImplementedError( "Abstract method" )
+
+ def add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ self._add_entry( entry, persist=persist, persist_on_error=persist_on_error, **kwd )
+ self._loaded_content_version += 1
+ return self._loaded_content_version
+
+ def is_current_version( self, other_version ):
+ return self._loaded_content_version == other_version
class TabularToolDataTable( ToolDataTable ):
"""
@@ -234,6 +247,9 @@
def get_fields( self ):
return self.data
+
+ def get_version_fields( self ):
+ return ( self._loaded_content_version, self.data )
def parse_column_spec( self, config_element ):
"""
@@ -324,6 +340,61 @@
rval = fields[ return_col ]
break
return rval
-
+
+ def _add_entry( self, entry, persist=False, persist_on_error=False, **kwd ):
+ #accepts dict or list of columns
+ if isinstance( entry, dict ):
+ fields = []
+ for column_name in self.get_column_name_list():
+ if column_name not in entry:
+ log.debug( "Using default column value for column '%s' when adding data table entry (%s) to table '%s'.", column_name, entry, self.name )
+ field_value = self.get_empty_field_by_name( column_name )
+ else:
+ field_value = entry[ column_name ]
+ fields.append( field_value )
+ else:
+ fields = entry
+ if self.largest_index < len( fields ):
+ fields = self._replace_field_separators( fields )
+ self.data.append( fields )
+ field_len_error = False
+ else:
+ log.error( "Attempted to add fields (%s) to data table '%s', but there were not enough fields specified ( %i < %i ).", fields, self.name, len( fields ), self.largest_index + 1 )
+ field_len_error = True
+ if persist and ( not field_len_error or persist_on_error ):
+ #FIXME: Need to lock these files for editing
+ try:
+ data_table_fh = open( self.filename, 'r+b' )
+ except IOError, e:
+ log.warning( 'Error opening data table file (%s) with r+b, assuming file does not exist and will open as wb: %s', self.filename, e )
+ data_table_fh = open( self.filename, 'wb' )
+ if os.stat( self.filename )[6] != 0:
+ # ensure last existing line ends with new line
+ data_table_fh.seek( -1, 2 ) #last char in file
+ last_char = data_table_fh.read( 1 )
+ if last_char not in [ '\n', '\r' ]:
+ data_table_fh.write( '\n' )
+ data_table_fh.write( "%s\n" % ( self.separator.join( fields ) ) )
+ return not field_len_error
+
+ def _replace_field_separators( self, fields, separator=None, replace=None, comment_char=None ):
+ #make sure none of the fields contain separator
+ #make sure separator replace is different from comment_char,
+ #due to possible leading replace
+ if separator is None:
+ separator = self.separator
+ if replace is None:
+ if separator == " ":
+ if comment_char == "\t":
+ replace = "_"
+ else:
+ replace = "\t"
+ else:
+ if comment_char == " ":
+ replace = "_"
+ else:
+ replace = " "
+ return map( lambda x: x.replace( separator, replace ), fields )
+
# Registry of tool data types by type_key
tool_data_table_types = dict( [ ( cls.type_key, cls ) for cls in [ TabularToolDataTable ] ] )
diff -r 08b05ff1c7cfaad179d25970df0b740da282550e -r ec277b165b13131d02998c3cbab3b3e664141a58 lib/galaxy/tools/data_manager/manager.py
--- a/lib/galaxy/tools/data_manager/manager.py
+++ b/lib/galaxy/tools/data_manager/manager.py
@@ -233,59 +233,21 @@
assert output_ref_dataset is not None, "Referenced output was not found."
output_ref_values[ data_table_column ] = output_ref_dataset
- final_data_table_values = []
if not isinstance( data_table_values, list ):
data_table_values = [ data_table_values ]
- columns = data_table.get_column_name_list()
- #FIXME: Need to lock these files for editing
- try:
- data_table_fh = open( data_table.filename, 'r+b' )
- except IOError, e:
- log.warning( 'Error opening data table file (%s) with r+b, assuming file does not exist and will open as wb: %s' % ( data_table.filename, e ) )
- data_table_fh = open( data_table.filename, 'wb' )
- if os.stat( data_table.filename )[6] != 0:
- # ensure last existing line ends with new line
- data_table_fh.seek( -1, 2 ) #last char in file
- last_char = data_table_fh.read()
- if last_char not in [ '\n', '\r' ]:
- data_table_fh.write( '\n' )
for data_table_row in data_table_values:
data_table_value = dict( **data_table_row ) #keep original values here
for name, value in data_table_row.iteritems(): #FIXME: need to loop through here based upon order listed in data_manager config
if name in output_ref_values:
moved = self.process_move( data_table_name, name, output_ref_values[ name ].extra_files_path, **data_table_value )
data_table_value[ name ] = self.process_value_translation( data_table_name, name, **data_table_value )
- final_data_table_values.append( data_table_value )
- fields = []
- for column_name in columns:
- if column_name is None or column_name not in data_table_value:
- fields.append( data_table.get_empty_field_by_name( column_name ) )
- else:
- fields.append( data_table_value[ column_name ] )
- #should we add a comment to file about automatically generated value here?
- data_table_fh.write( "%s\n" % ( data_table.separator.join( self._replace_field_separators( fields, separator=data_table.separator ) ) ) ) #write out fields to disk
- data_table.data.append( fields ) #add fields to loaded data table
- data_table_fh.close()
+ data_table.add_entry( data_table_value, persist=True )
+
for data_table_name, data_table_values in data_tables_dict.iteritems():
#tool returned extra data table entries, but data table was not declared in data manager
#do not add these values, but do provide messages
log.warning( 'The data manager "%s" returned an undeclared data table "%s" with new entries "%s". These entries will not be created. Please confirm that an entry for "%s" exists in your "%s" file.' % ( self.id, data_table_name, data_table_values, data_table_name, self.data_managers.filename ) )
- def _replace_field_separators( self, fields, separator="\t", replace=None, comment_char=None ):
- #make sure none of the fields contain separator
- #make sure separator replace is different from comment_char,
- #due to possible leading replace
- if replace is None:
- if separator == " ":
- if comment_char == "\t":
- replace = "_"
- else:
- replace = "\t"
- else:
- if comment_char == " ":
- replace = "_"
- else:
- replace = " "
- return map( lambda x: x.replace( separator, replace ), fields )
+
def process_move( self, data_table_name, column_name, source_base_path, relative_symlinks=False, **kwd ):
if data_table_name in self.move_by_data_table_column and column_name in self.move_by_data_table_column[ data_table_name ]:
move_dict = self.move_by_data_table_column[ data_table_name ][ column_name ]
diff -r 08b05ff1c7cfaad179d25970df0b740da282550e -r ec277b165b13131d02998c3cbab3b3e664141a58 lib/galaxy/tools/parameters/validation.py
--- a/lib/galaxy/tools/parameters/validation.py
+++ b/lib/galaxy/tools/parameters/validation.py
@@ -293,18 +293,31 @@
if line_startswith:
line_startswith = line_startswith.strip()
return cls( tool_data_table, metadata_name, metadata_column, message, line_startswith )
+
def __init__( self, tool_data_table, metadata_name, metadata_column, message="Value for metadata not found.", line_startswith=None ):
self.metadata_name = metadata_name
self.message = message
self.valid_values = []
+ self._data_table_content_version = None
+ self._tool_data_table = tool_data_table
if isinstance( metadata_column, basestring ):
metadata_column = tool_data_table.columns[ metadata_column ]
- for fields in tool_data_table.get_fields():
- if metadata_column < len( fields ):
- self.valid_values.append( fields[ metadata_column ] )
+ self._metadata_column = metadata_column
+ self._load_values()
+
+ def _load_values( self ):
+ self._data_table_content_version, data_fields = self._tool_data_table.get_version_fields()
+ self.valid_values = []
+ for fields in data_fields:
+ if self._metadata_column < len( fields ):
+ self.valid_values.append( fields[ self._metadata_column ] )
+
def validate( self, value, history = None ):
if not value: return
if hasattr( value, "metadata" ):
+ if not self._tool_data_table.is_current_version( self._data_table_content_version ):
+ log.debug( 'MetadataInDataTableColumnValidator values are out of sync with data table (%s), updating validator.', self._tool_data_table.name )
+ self._load_values()
if value.metadata.spec[self.metadata_name].param.to_string( value.metadata.get( self.metadata_name ) ) in self.valid_values:
return
raise ValueError( self.message )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: Dave Bouvier: Enhance the install and test framework to support testing a single changeset revision of a repository. This defaults to testing all repositories and changeset revisions determined to be installable and testable.
by commits-noreply@bitbucket.org 24 May '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/7e820c43d641/
Changeset: 7e820c43d641
User: Dave Bouvier
Date: 2013-05-24 17:26:19
Summary: Enhance the install and test framework to support testing a single changeset revision of a repository. This defaults to testing all repositories and changeset revisions determined to be installable and testable.
Affected #: 1 file
diff -r cf574a68a1f95524b9412e0944ee2cd705a63ce5 -r 7e820c43d641d784d73526637e3d55ede5f8327e test/install_and_test_tool_shed_repositories/functional_tests.py
--- a/test/install_and_test_tool_shed_repositories/functional_tests.py
+++ b/test/install_and_test_tool_shed_repositories/functional_tests.py
@@ -145,7 +145,15 @@
else:
galaxy_encode_secret = os.environ[ 'GALAXY_INSTALL_TEST_SECRET' ]
-
+testing_single_repository = {}
+if 'repository_name' in os.environ and 'repository_owner' in os.environ:
+ testing_single_repository[ 'name' ] = os.environ[ 'repository_name' ]
+ testing_single_repository[ 'owner' ] = os.environ[ 'repository_owner' ]
+ if 'repository_revision' in os.environ:
+ testing_single_repository[ 'changeset_revision' ] = os.environ[ 'repository_revision' ]
+ else:
+ testing_single_repository[ 'changeset_revision' ] = None
+
class ReportResults( Plugin ):
'''Simple Nose plugin to record the IDs of all tests run, regardless of success.'''
name = "reportresults"
@@ -308,6 +316,19 @@
skipped_previous = ' and metadata revisions that are not the most recent'
else:
skipped_previous = ''
+ if testing_single_repository:
+ log.info( 'Testing single repository with name %s and owner %s.',
+ testing_single_repository[ 'name' ],
+ testing_single_repository[ 'owner' ])
+ for repository_to_install in detailed_repository_list:
+ if repository_to_install[ 'name' ] == testing_single_repository[ 'name' ] \
+ and repository_to_install[ 'owner' ] == testing_single_repository[ 'owner' ]:
+ if testing_single_repository[ 'changeset_revision' ] is None:
+ return [ repository_to_install ]
+ else:
+ if testing_single_repository[ 'changeset_revision' ] == repository_to_install[ 'changeset_revision' ]:
+ return [ repository_to_install ]
+ return []
log.info( 'After removing deleted repositories%s from the list, %d remain to be tested.', skipped_previous, repositories_tested )
return detailed_repository_list
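The selection logic this changeset adds can be sketched as a pair of pure functions: one reads the optional name/owner/revision triple from the environment, the other narrows the repository list. A minimal sketch under illustrative names (`get_single_repository_filter` and `filter_repositories` are not functions in the framework; also, unlike the changeset, which returns on the first match when no revision is given, this sketch returns every matching revision):

```python
import os


def get_single_repository_filter(environ=os.environ):
    """Read an optional single-repository request from the environment.

    Returns None when no single-repository test was requested, i.e. the
    framework should fall back to testing everything installable.
    """
    if "repository_name" in environ and "repository_owner" in environ:
        return {
            "name": environ["repository_name"],
            "owner": environ["repository_owner"],
            # Revision is optional; None means "any installable revision".
            "changeset_revision": environ.get("repository_revision"),
        }
    return None


def filter_repositories(repositories, single):
    """Keep only repositories matching the request; everything if none."""
    if not single:
        return repositories
    matches = []
    for repo in repositories:
        if repo["name"] == single["name"] and repo["owner"] == single["owner"]:
            # None acts as a wildcard over changeset revisions.
            if single["changeset_revision"] in (None, repo["changeset_revision"]):
                matches.append(repo)
    return matches
```

Driving the filter through environment variables keeps the nose-based test entry point unchanged: a buildbot step can export `repository_name`/`repository_owner` to bisect a single failing repository without touching the script's arguments.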
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/08b05ff1c7cf/
Changeset: 08b05ff1c7cf
Branch: next-stable
User: Dave Bouvier
Date: 2013-05-23 21:49:46
Summary: Fix for environment variables set in set_environment_for_install not being propagated to set_environment actions.
Affected #: 1 file
diff -r 5ee2362aff09f103891075c218f7474a662e569b -r 08b05ff1c7cfaad179d25970df0b740da282550e lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -22,6 +22,7 @@
log = logging.getLogger( __name__ )
+CMD_SEPARATOR = '__CMD_SEP__'
INSTALLATION_LOG = 'INSTALLATION.log'
VIRTUALENV_URL = 'https://pypi.python.org/packages/source/v/virtualenv/virtualenv-1.9.1.tar.gz'
@@ -49,6 +50,29 @@
return output
return output.return_code
+def handle_environment_variables( app, tool_dependency, install_dir, env_var_dict, set_prior_environment_commands ):
+ env_var_value = env_var_dict[ 'value' ]
+ if '$ENV[' in env_var_value and ']' in env_var_value:
+ # Pull out the name of the environment variable to populate.
+ inherited_env_var_name = env_var_value.split( '[' )[1].split( ']' )[0]
+ to_replace = '$ENV[%s]' % inherited_env_var_name
+ # Build a command line that outputs CMD_SEPARATOR + environment variable value + CMD_SEPARATOR.
+ set_prior_environment_commands.extend( [ "echo '%s'" % CMD_SEPARATOR, 'echo $%s' % inherited_env_var_name, "echo '%s'" % CMD_SEPARATOR ] )
+ command = ' ; '.join( set_prior_environment_commands )
+ # Run the command and capture the output.
+ command_return = handle_command( app, tool_dependency, install_dir, command, return_output=True )
+ # And extract anything between the two instances of CMD_SEPARATOR.
+ environment_variable_value = command_return.split( CMD_SEPARATOR )[1].split( CMD_SEPARATOR )[0].strip( '\n' )
+ if environment_variable_value:
+ log.info( 'Replacing %s with %s in env.sh for this repository.', to_replace, environment_variable_value )
+ env_var_value = env_var_value.replace( to_replace, environment_variable_value )
+ else:
+ # If the return is empty, replace the original $ENV[] with nothing, to avoid any shell misparsings later on.
+ log.error( 'Environment variable %s not found, removing from set_environment.', inherited_env_var_name )
+ env_var_value = env_var_value.replace( to_replace, '$%s' % inherited_env_var_name )
+ env_var_dict[ 'value' ] = env_var_value
+ return env_var_dict
+
def install_virtualenv( app, venv_dir ):
if not os.path.exists( venv_dir ):
with make_tmp_dir() as work_dir:
@@ -149,10 +173,18 @@
destination_dir=os.path.join( action_dict[ 'destination' ] ) )
elif action_type == 'set_environment':
# Currently the only action supported in this category is "environment_variable".
+ # Build a command line from the prior_installation_required, in case an environment variable is referenced
+ # in the set_environment action.
+ cmds = []
+ for env_shell_file_path in env_shell_file_paths:
+ for i, env_setting in enumerate( open( env_shell_file_path ) ):
+ cmds.append( env_setting.strip( '\n' ) )
env_var_dicts = action_dict[ 'environment_variable' ]
for env_var_dict in env_var_dicts:
- cmd = common_util.create_or_update_env_shell_file( install_dir, env_var_dict )
- return_code = handle_command( app, tool_dependency, install_dir, cmd )
+ # Check for the presence of the $ENV[] key string and populate it if possible.
+ env_var_dict = handle_environment_variables( app, tool_dependency, install_dir, env_var_dict, cmds )
+ env_command = common_util.create_or_update_env_shell_file( install_dir, env_var_dict )
+ return_code = handle_command( app, tool_dependency, install_dir, env_command )
if return_code:
return
elif action_type == 'set_environment_for_install':
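The substitution this changeset performs, replacing a `$ENV[NAME]` token with the variable's resolved value, or falling back to a plain `$NAME` shell reference when resolution comes back empty, can be sketched with a regex against an environment mapping. Note this simplification: the real `handle_environment_variables` resolves the value by running the prior installation's `env.sh` commands in a shell and scraping the output from between `CMD_SEPARATOR` sentinels, whereas this hypothetical `resolve_env_references` just reads a dict:

```python
import os
import re


def resolve_env_references(value, environ=os.environ):
    """Replace each $ENV[NAME] token in value with NAME's resolved value.

    When the variable is unset or empty, leave a literal $NAME reference
    instead, mirroring the changeset's fallback path so later shell
    evaluation still has a chance to resolve it.
    """
    def substitute(match):
        name = match.group(1)
        resolved = environ.get(name, "")
        if resolved:
            return resolved
        # Resolution failed: degrade to a plain shell variable reference.
        return "$%s" % name

    return re.sub(r"\$ENV\[([^\]]+)\]", substitute, value)
```

The sentinel-echo approach in the diff exists because the referenced variable is defined by an upstream dependency's `env.sh`, not by the current process environment, so the only reliable way to read it is to source those commands in a subshell and capture what `echo $NAME` prints between the two `__CMD_SEP__` markers.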
https://bitbucket.org/galaxy/galaxy-central/commits/cf574a68a1f9/
Changeset: cf574a68a1f9
User: Dave Bouvier
Date: 2013-05-23 21:50:34
Summary: Merged in next-stable.
Affected #: 1 file
diff -r 681e2c7658482d6148fad734c5080c164efae8b7 -r cf574a68a1f95524b9412e0944ee2cd705a63ce5 lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -22,6 +22,7 @@
log = logging.getLogger( __name__ )
+CMD_SEPARATOR = '__CMD_SEP__'
INSTALLATION_LOG = 'INSTALLATION.log'
VIRTUALENV_URL = 'https://pypi.python.org/packages/source/v/virtualenv/virtualenv-1.9.1.tar.gz'
@@ -49,6 +50,29 @@
return output
return output.return_code
+def handle_environment_variables( app, tool_dependency, install_dir, env_var_dict, set_prior_environment_commands ):
+ env_var_value = env_var_dict[ 'value' ]
+ if '$ENV[' in env_var_value and ']' in env_var_value:
+ # Pull out the name of the environment variable to populate.
+ inherited_env_var_name = env_var_value.split( '[' )[1].split( ']' )[0]
+ to_replace = '$ENV[%s]' % inherited_env_var_name
+ # Build a command line that outputs CMD_SEPARATOR + environment variable value + CMD_SEPARATOR.
+ set_prior_environment_commands.extend( [ "echo '%s'" % CMD_SEPARATOR, 'echo $%s' % inherited_env_var_name, "echo '%s'" % CMD_SEPARATOR ] )
+ command = ' ; '.join( set_prior_environment_commands )
+ # Run the command and capture the output.
+ command_return = handle_command( app, tool_dependency, install_dir, command, return_output=True )
+ # And extract anything between the two instances of CMD_SEPARATOR.
+ environment_variable_value = command_return.split( CMD_SEPARATOR )[1].split( CMD_SEPARATOR )[0].strip( '\n' )
+ if environment_variable_value:
+ log.info( 'Replacing %s with %s in env.sh for this repository.', to_replace, environment_variable_value )
+ env_var_value = env_var_value.replace( to_replace, environment_variable_value )
+ else:
+ # If the return is empty, replace the original $ENV[] with nothing, to avoid any shell misparsings later on.
+ log.error( 'Environment variable %s not found, removing from set_environment.', inherited_env_var_name )
+ env_var_value = env_var_value.replace( to_replace, '$%s' % inherited_env_var_name )
+ env_var_dict[ 'value' ] = env_var_value
+ return env_var_dict
+
def install_virtualenv( app, venv_dir ):
if not os.path.exists( venv_dir ):
with make_tmp_dir() as work_dir:
@@ -149,10 +173,18 @@
destination_dir=os.path.join( action_dict[ 'destination' ] ) )
elif action_type == 'set_environment':
# Currently the only action supported in this category is "environment_variable".
+ # Build a command line from the prior_installation_required, in case an environment variable is referenced
+ # in the set_environment action.
+ cmds = []
+ for env_shell_file_path in env_shell_file_paths:
+ for i, env_setting in enumerate( open( env_shell_file_path ) ):
+ cmds.append( env_setting.strip( '\n' ) )
env_var_dicts = action_dict[ 'environment_variable' ]
for env_var_dict in env_var_dicts:
- cmd = common_util.create_or_update_env_shell_file( install_dir, env_var_dict )
- return_code = handle_command( app, tool_dependency, install_dir, cmd )
+ # Check for the presence of the $ENV[] key string and populate it if possible.
+ env_var_dict = handle_environment_variables( app, tool_dependency, install_dir, env_var_dict, cmds )
+ env_command = common_util.create_or_update_env_shell_file( install_dir, env_var_dict )
+ return_code = handle_command( app, tool_dependency, install_dir, env_command )
if return_code:
return
elif action_type == 'set_environment_for_install':
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
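The set_environment hunk above ultimately hands each env_var_dict to `common_util.create_or_update_env_shell_file`, which emits env.sh lines for the dependency. A hedged sketch of what such a line might look like; the function name `env_shell_line` and exact formatting are illustrative, not Galaxy's verbatim output.

```python
# Illustrative: render one env.sh entry for a set_environment action.
# 'set_to' overwrites, 'prepend_to'/'append_to' extend an existing value.
def env_shell_line(name, value, action='set_to'):
    if action == 'prepend_to':
        return '%s=%s:$%s; export %s' % (name, value, name, name)
    if action == 'append_to':
        return '%s=$%s:%s; export %s' % (name, name, value, name)
    return '%s=%s; export %s' % (name, value, name)
```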
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/5ee2362aff09/
Changeset: 5ee2362aff09
Branch: next-stable
User: carlfeberhard
Date: 2013-05-23 21:44:17
Summary: Remove access check from get_hda_dict; Add profiler version of same
Affected #: 3 files
diff -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 -r 5ee2362aff09f103891075c218f7474a662e569b lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -540,18 +540,19 @@
def get_hda_dict( self, trans, hda ):
"""Return full details of this HDA in dictionary form.
"""
+ #precondition: the user's access to this hda has already been checked
+ #TODO:?? postcondition: all ids are encoded (is this really what we want at this level?)
hda_dict = hda.get_api_value( view='element' )
- history = hda.history
hda_dict[ 'api_type' ] = "file"
# Add additional attributes that depend on trans and hence must be added here rather than at the model level.
- can_access_hda = trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), hda.dataset )
- can_access_hda = ( trans.user_is_admin() or can_access_hda )
- hda_dict[ 'accessible' ] = can_access_hda
+
+ #NOTE: access is an expensive check - it is removed here under the precondition that access has already been verified
+ hda_dict[ 'accessible' ] = True
# ---- return here if deleted AND purged OR can't access
purged = ( hda.purged or hda.dataset.purged )
- if ( hda.deleted and purged ) or not can_access_hda:
+ if ( hda.deleted and purged ):
#TODO: get_api_value should really go AFTER this - only summary data
return trans.security.encode_dict_ids( hda_dict )
@@ -559,7 +560,7 @@
hda_dict[ 'file_name' ] = hda.file_name
hda_dict[ 'download_url' ] = url_for( 'history_contents_display',
- history_id = trans.security.encode_id( history.id ),
+ history_id = trans.security.encode_id( hda.history.id ),
history_content_id = trans.security.encode_id( hda.id ) )
# indices, assoc. metadata files, etc.
@@ -592,6 +593,72 @@
return trans.security.encode_dict_ids( hda_dict )
+ def profile_get_hda_dict( self, trans, hda ):
+ """Profiles returning full details of this HDA in dictionary form.
+ """
+ from galaxy.util.debugging import SimpleProfiler
+ profiler = SimpleProfiler()
+ profiler.start()
+
+ hda_dict = hda.get_api_value( view='element' )
+ profiler.report( '\t\t get_api_value' )
+ history = hda.history
+ hda_dict[ 'api_type' ] = "file"
+
+ # Add additional attributes that depend on trans and hence must be added here rather than at the model level.
+ can_access_hda = trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), hda.dataset )
+ can_access_hda = ( trans.user_is_admin() or can_access_hda )
+ hda_dict[ 'accessible' ] = can_access_hda
+ profiler.report( '\t\t accessible' )
+
+ # ---- return here if deleted AND purged OR can't access
+ purged = ( hda.purged or hda.dataset.purged )
+ if ( hda.deleted and purged ) or not can_access_hda:
+ #TODO: get_api_value should really go AFTER this - only summary data
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
+ if trans.user_is_admin() or trans.app.config.expose_dataset_path:
+ hda_dict[ 'file_name' ] = hda.file_name
+ profiler.report( '\t\t file_name' )
+
+ hda_dict[ 'download_url' ] = url_for( 'history_contents_display',
+ history_id = trans.security.encode_id( history.id ),
+ history_content_id = trans.security.encode_id( hda.id ) )
+ profiler.report( '\t\t download_url' )
+
+ # indices, assoc. metadata files, etc.
+ meta_files = []
+ for meta_type in hda.metadata.spec.keys():
+ if isinstance( hda.metadata.spec[ meta_type ].param, FileParameter ):
+ meta_files.append( dict( file_type=meta_type ) )
+ if meta_files:
+ hda_dict[ 'meta_files' ] = meta_files
+ profiler.report( '\t\t meta_files' )
+
+ # currently, the viz reg is optional - handle on/off
+ if trans.app.visualizations_registry:
+ hda_dict[ 'visualizations' ] = trans.app.visualizations_registry.get_visualizations( trans, hda )
+ else:
+ hda_dict[ 'visualizations' ] = hda.get_visualizations()
+ profiler.report( '\t\t visualizations' )
+ #TODO: it may also be wiser to remove from here and add as API call that loads the visualizations
+ # when the visualizations button is clicked (instead of preloading/pre-checking)
+
+ # ---- return here if deleted
+ if hda.deleted and not purged:
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
+ # if a tool declares 'force_history_refresh' in its xml, when the hda -> ready, reload the history panel
+ # expensive
+ if( ( hda.state in [ 'running', 'queued' ] )
+ and ( hda.creating_job and hda.creating_job.tool_id ) ):
+ tool_used = trans.app.toolbox.get_tool( hda.creating_job.tool_id )
+ if tool_used and tool_used.force_history_refresh:
+ hda_dict[ 'force_history_refresh' ] = True
+ profiler.report( '\t\t force_history_refresh' )
+
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
def get_hda_dict_with_error( self, trans, hda, error_msg='' ):
return trans.security.encode_dict_ids({
'id' : hda.id,
diff -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 -r 5ee2362aff09f103891075c218f7474a662e569b lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -54,7 +54,6 @@
hda_dict = self.get_hda_dict( trans, hda )
hda_dict[ 'display_types' ] = self.get_old_display_applications( trans, hda )
hda_dict[ 'display_apps' ] = self.get_display_apps( trans, hda )
- #rval.append( self.get_hda_dict( trans, hda ) )
rval.append( hda_dict )
except Exception, exc:
@@ -115,6 +114,7 @@
check_ownership=False, check_accessible=True )
else:
+ #TODO: do we really need the history?
history = self.get_history( trans, history_id,
check_ownership=True, check_accessible=True, deleted=False )
hda = self.get_history_dataset_association( trans, history, id,
diff -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 -r 5ee2362aff09f103891075c218f7474a662e569b lib/galaxy/webapps/galaxy/controllers/root.py
--- a/lib/galaxy/webapps/galaxy/controllers/root.py
+++ b/lib/galaxy/webapps/galaxy/controllers/root.py
@@ -212,6 +212,7 @@
history_dictionary = {}
hda_dictionaries = []
+ import pprint
try:
history = trans.get_history( create=True )
profiler.report( 'trans.get_history' )
@@ -221,8 +222,10 @@
for hda in hdas:
try:
- hda_dictionaries.append( self.get_hda_dict( trans, hda ) )
+ ( hda_profiler, hda_dict ) = self.profile_get_hda_dict( trans, hda )
+ profiler.reports.extend( hda_profiler.get_reports() )
profiler.report( '\t hda -> dictionary (%s)' %( hda.name ) )
+ hda_dictionaries.append( hda_dict )
except Exception, exc:
# don't fail entire list if hda err's, record and move on
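The `profile_get_hda_dict` variant above relies on `SimpleProfiler` from `galaxy.util.debugging`, whose interface can be inferred from the calls: `start()`, `report(label)`, a `reports` list, and `get_reports()`. A minimal sketch of that pattern, assuming this inferred interface; `SketchProfiler` is illustrative, not the actual Galaxy class.

```python
import time

# Illustrative profiler: each report() records the label and the elapsed
# time since the previous report (or since start()).
class SketchProfiler(object):
    def __init__(self):
        self.reports = []
        self._last = None

    def start(self):
        self._last = time.time()

    def report(self, label):
        now = time.time()
        self.reports.append((label, now - self._last))
        self._last = now

    def get_reports(self):
        return self.reports
```

This supports the aggregation seen in root.py, where a parent profiler extends its `reports` with a per-HDA profiler's `get_reports()`.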
https://bitbucket.org/galaxy/galaxy-central/commits/681e2c765848/
Changeset: 681e2c765848
User: carlfeberhard
Date: 2013-05-23 21:45:11
Summary: merge next-stable
Affected #: 3 files
diff -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 -r 681e2c7658482d6148fad734c5080c164efae8b7 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -540,18 +540,19 @@
def get_hda_dict( self, trans, hda ):
"""Return full details of this HDA in dictionary form.
"""
+ #precondition: the user's access to this hda has already been checked
+ #TODO:?? postcondition: all ids are encoded (is this really what we want at this level?)
hda_dict = hda.get_api_value( view='element' )
- history = hda.history
hda_dict[ 'api_type' ] = "file"
# Add additional attributes that depend on trans and hence must be added here rather than at the model level.
- can_access_hda = trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), hda.dataset )
- can_access_hda = ( trans.user_is_admin() or can_access_hda )
- hda_dict[ 'accessible' ] = can_access_hda
+
+ #NOTE: access is an expensive check - it is removed here under the precondition that access has already been verified
+ hda_dict[ 'accessible' ] = True
# ---- return here if deleted AND purged OR can't access
purged = ( hda.purged or hda.dataset.purged )
- if ( hda.deleted and purged ) or not can_access_hda:
+ if ( hda.deleted and purged ):
#TODO: get_api_value should really go AFTER this - only summary data
return trans.security.encode_dict_ids( hda_dict )
@@ -559,7 +560,7 @@
hda_dict[ 'file_name' ] = hda.file_name
hda_dict[ 'download_url' ] = url_for( 'history_contents_display',
- history_id = trans.security.encode_id( history.id ),
+ history_id = trans.security.encode_id( hda.history.id ),
history_content_id = trans.security.encode_id( hda.id ) )
# indices, assoc. metadata files, etc.
@@ -592,6 +593,72 @@
return trans.security.encode_dict_ids( hda_dict )
+ def profile_get_hda_dict( self, trans, hda ):
+ """Profiles returning full details of this HDA in dictionary form.
+ """
+ from galaxy.util.debugging import SimpleProfiler
+ profiler = SimpleProfiler()
+ profiler.start()
+
+ hda_dict = hda.get_api_value( view='element' )
+ profiler.report( '\t\t get_api_value' )
+ history = hda.history
+ hda_dict[ 'api_type' ] = "file"
+
+ # Add additional attributes that depend on trans and hence must be added here rather than at the model level.
+ can_access_hda = trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), hda.dataset )
+ can_access_hda = ( trans.user_is_admin() or can_access_hda )
+ hda_dict[ 'accessible' ] = can_access_hda
+ profiler.report( '\t\t accessible' )
+
+ # ---- return here if deleted AND purged OR can't access
+ purged = ( hda.purged or hda.dataset.purged )
+ if ( hda.deleted and purged ) or not can_access_hda:
+ #TODO: get_api_value should really go AFTER this - only summary data
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
+ if trans.user_is_admin() or trans.app.config.expose_dataset_path:
+ hda_dict[ 'file_name' ] = hda.file_name
+ profiler.report( '\t\t file_name' )
+
+ hda_dict[ 'download_url' ] = url_for( 'history_contents_display',
+ history_id = trans.security.encode_id( history.id ),
+ history_content_id = trans.security.encode_id( hda.id ) )
+ profiler.report( '\t\t download_url' )
+
+ # indices, assoc. metadata files, etc.
+ meta_files = []
+ for meta_type in hda.metadata.spec.keys():
+ if isinstance( hda.metadata.spec[ meta_type ].param, FileParameter ):
+ meta_files.append( dict( file_type=meta_type ) )
+ if meta_files:
+ hda_dict[ 'meta_files' ] = meta_files
+ profiler.report( '\t\t meta_files' )
+
+ # currently, the viz reg is optional - handle on/off
+ if trans.app.visualizations_registry:
+ hda_dict[ 'visualizations' ] = trans.app.visualizations_registry.get_visualizations( trans, hda )
+ else:
+ hda_dict[ 'visualizations' ] = hda.get_visualizations()
+ profiler.report( '\t\t visualizations' )
+ #TODO: it may also be wiser to remove from here and add as API call that loads the visualizations
+ # when the visualizations button is clicked (instead of preloading/pre-checking)
+
+ # ---- return here if deleted
+ if hda.deleted and not purged:
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
+ # if a tool declares 'force_history_refresh' in its xml, when the hda -> ready, reload the history panel
+ # expensive
+ if( ( hda.state in [ 'running', 'queued' ] )
+ and ( hda.creating_job and hda.creating_job.tool_id ) ):
+ tool_used = trans.app.toolbox.get_tool( hda.creating_job.tool_id )
+ if tool_used and tool_used.force_history_refresh:
+ hda_dict[ 'force_history_refresh' ] = True
+ profiler.report( '\t\t force_history_refresh' )
+
+ return ( profiler, trans.security.encode_dict_ids( hda_dict ) )
+
def get_hda_dict_with_error( self, trans, hda, error_msg='' ):
return trans.security.encode_dict_ids({
'id' : hda.id,
diff -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 -r 681e2c7658482d6148fad734c5080c164efae8b7 lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -54,7 +54,6 @@
hda_dict = self.get_hda_dict( trans, hda )
hda_dict[ 'display_types' ] = self.get_old_display_applications( trans, hda )
hda_dict[ 'display_apps' ] = self.get_display_apps( trans, hda )
- #rval.append( self.get_hda_dict( trans, hda ) )
rval.append( hda_dict )
except Exception, exc:
@@ -115,6 +114,7 @@
check_ownership=False, check_accessible=True )
else:
+ #TODO: do we really need the history?
history = self.get_history( trans, history_id,
check_ownership=True, check_accessible=True, deleted=False )
hda = self.get_history_dataset_association( trans, history, id,
diff -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 -r 681e2c7658482d6148fad734c5080c164efae8b7 lib/galaxy/webapps/galaxy/controllers/root.py
--- a/lib/galaxy/webapps/galaxy/controllers/root.py
+++ b/lib/galaxy/webapps/galaxy/controllers/root.py
@@ -212,6 +212,7 @@
history_dictionary = {}
hda_dictionaries = []
+ import pprint
try:
history = trans.get_history( create=True )
profiler.report( 'trans.get_history' )
@@ -221,8 +222,10 @@
for hda in hdas:
try:
- hda_dictionaries.append( self.get_hda_dict( trans, hda ) )
+ ( hda_profiler, hda_dict ) = self.profile_get_hda_dict( trans, hda )
+ profiler.reports.extend( hda_profiler.get_reports() )
profiler.report( '\t hda -> dictionary (%s)' %( hda.name ) )
+ hda_dictionaries.append( hda_dict )
except Exception, exc:
# don't fail entire list if hda err's, record and move on
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ad8d4f2080c0/
Changeset: ad8d4f2080c0
Branch: next-stable
User: greg
Date: 2013-05-23 20:57:12
Summary: Fixes for xml handling in the tool shed.
Affected #: 12 files
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -14,6 +14,7 @@
from tool_shed.util import repository_dependency_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -131,7 +132,7 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -140,7 +141,7 @@
# are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -268,13 +269,13 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
if ok:
repo_path = os.path.join( full_path, relative_path )
@@ -330,13 +331,13 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
return commit_util.handle_directory_changes( trans,
repository,
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/galaxy_install/install_manager.py
--- a/lib/tool_shed/galaxy_install/install_manager.py
+++ b/lib/tool_shed/galaxy_install/install_manager.py
@@ -15,6 +15,7 @@
from tool_shed.util import metadata_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
from galaxy.util.odict import odict
@@ -37,13 +38,13 @@
self.proprietary_tool_confs = self.non_shed_tool_panel_configs
self.proprietary_tool_panel_elems = self.get_proprietary_tool_panel_elems( latest_migration_script_number )
# Set the location where the repositories will be installed by retrieving the tool_path setting from migrated_tools_config.
- tree = util.parse_xml( migrated_tools_config )
+ tree = xml_util.parse_xml( migrated_tools_config )
root = tree.getroot()
self.tool_path = root.get( 'tool_path' )
print "Repositories will be installed into configured tool_path location ", str( self.tool_path )
# Parse tool_shed_install_config to check each of the tools.
self.tool_shed_install_config = tool_shed_install_config
- tree = util.parse_xml( tool_shed_install_config )
+ tree = xml_util.parse_xml( tool_shed_install_config )
root = tree.getroot()
self.tool_shed = suc.clean_tool_shed_url( root.get( 'name' ) )
self.repository_owner = common_util.REPOSITORY_OWNER
@@ -107,7 +108,7 @@
tools_xml_file_path = os.path.abspath( os.path.join( 'scripts', 'migrate_tools', '%04d_tools.xml' % latest_tool_migration_script_number ) )
# Parse the XML and load the file attributes for later checking against the integrated elements from self.proprietary_tool_confs.
migrated_tool_configs = []
- tree = util.parse_xml( tools_xml_file_path )
+ tree = xml_util.parse_xml( tools_xml_file_path )
root = tree.getroot()
for elem in root:
if elem.tag == 'repository':
@@ -116,7 +117,7 @@
# Parse each file in self.proprietary_tool_confs and generate the integrated list of tool panel Elements that contain them.
tool_panel_elems = []
for proprietary_tool_conf in self.proprietary_tool_confs:
- tree = util.parse_xml( proprietary_tool_conf )
+ tree = xml_util.parse_xml( proprietary_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -10,6 +10,7 @@
import tool_shed.util.common_util as cu
from tool_shed.util import encoding_util
from tool_shed.util import tool_dependency_util
+from tool_shed.util import xml_util
from galaxy.model.orm import and_
from galaxy.web import url_for
@@ -478,7 +479,7 @@
tool_dependency = None
action_dict = {}
if tool_dependencies_config:
- required_td_tree = parse_xml( tool_dependencies_config )
+ required_td_tree = xml_util.parse_xml( tool_dependencies_config )
if required_td_tree:
required_td_root = required_td_tree.getroot()
for required_td_elem in required_td_root:
@@ -623,17 +624,6 @@
file_name = fpath
return file_name
-def parse_xml( file_name ):
- """Returns a parsed xml tree."""
- try:
- tree = ElementTree.parse( file_name )
- except Exception, e:
- print "Exception attempting to parse ", file_name, ": ", str( e )
- return None
- root = tree.getroot()
- ElementInclude.include( root )
- return tree
-
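The `parse_xml` helper removed above (consolidated into `xml_util` by this changeset) parsed a file and expanded any XInclude references. A runnable sketch of the same behavior with modern syntax; the name `parse_xml_with_includes` is illustrative.

```python
from xml.etree import ElementTree, ElementInclude

# Parse an XML file, expand XInclude references in place, and return the
# tree; return None (after printing the error) when parsing fails.
def parse_xml_with_includes(file_name):
    try:
        tree = ElementTree.parse(file_name)
    except Exception as e:
        print("Exception attempting to parse %s: %s" % (file_name, e))
        return None
    ElementInclude.include(tree.getroot())
    return tree
```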
def url_join( *args ):
parts = []
for arg in args:
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/tool_shed_registry.py
--- a/lib/tool_shed/tool_shed_registry.py
+++ b/lib/tool_shed/tool_shed_registry.py
@@ -1,9 +1,8 @@
import logging
import sys
import urllib2
-from galaxy.util import parse_xml
from galaxy.util.odict import odict
-from xml.etree import ElementTree
+from tool_shed.util import xml_util
log = logging.getLogger( __name__ )
@@ -15,7 +14,7 @@
self.tool_sheds_auth = odict()
if root_dir and config:
# Parse tool_sheds_conf.xml
- tree = parse_xml( config )
+ tree = xml_util.parse_xml( config )
root = tree.getroot()
log.debug( 'Loading references to tool sheds from %s' % config )
for elem in root.findall( 'tool_shed' ):
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -9,7 +9,7 @@
from galaxy.web import url_for
import tool_shed.util.shed_util_common as suc
from tool_shed.util import tool_util
-import xml.etree.ElementTree
+from tool_shed.util import xml_util
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -60,17 +60,6 @@
message = 'The file "%s" contains image content.\n' % str( file_path )
return message
-def create_and_write_tmp_file( root ):
- tmp_str = '%s\n' % xml.etree.ElementTree.tostring( root, encoding='utf-8' )
- fh = tempfile.NamedTemporaryFile( 'wb' )
- tmp_filename = fh.name
- fh.close()
- fh = open( tmp_filename, 'wb' )
- fh.write( '<?xml version="1.0"?>\n' )
- fh.write( tmp_str )
- fh.close()
- return tmp_filename
-
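The removed `create_and_write_tmp_file` serialized a root element to a temp file with an XML declaration (it presumably lives on in `xml_util`). A sketch of the same behavior; using `mkstemp` avoids the close-then-reopen race in the removed `NamedTemporaryFile` version, and the function name `write_tmp_xml` is illustrative.

```python
import os
import tempfile
from xml.etree import ElementTree

# Serialize root to a named temp file, prefixed with an XML declaration,
# and return the file's path for the caller to move into place.
def write_tmp_xml(root):
    fd, tmp_filename = tempfile.mkstemp(suffix='.xml')
    with os.fdopen(fd, 'w') as fh:
        fh.write('<?xml version="1.0"?>\n')
        fh.write(ElementTree.tostring(root, encoding='unicode'))
        fh.write('\n')
    return tmp_filename
```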
def get_upload_point( repository, **kwd ):
upload_point = kwd.get( 'upload_point', None )
if upload_point is not None:
@@ -198,7 +187,7 @@
altered = False
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = suc.parse_xml( repository_dependencies_config )
+ tree = xml_util.parse_xml( repository_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_repository_dependencies_definition: " % str( repository_dependencies_config )
@@ -245,7 +234,7 @@
altered = False
try:
# Make sure we're looking at a valid tool_dependencies.xml file.
- tree = suc.parse_xml( tool_dependencies_config )
+ tree = xml_util.parse_xml( tool_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_tool_dependencies_definition: " % str( tool_dependencies_config )
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/common_util.py
--- a/lib/tool_shed/util/common_util.py
+++ b/lib/tool_shed/util/common_util.py
@@ -1,8 +1,8 @@
import os
import urllib2
-from galaxy import util
from galaxy.util.odict import odict
from tool_shed.util import encoding_util
+from tool_shed.util import xml_util
REPOSITORY_OWNER = 'devteam'
@@ -11,7 +11,7 @@
tools_xml_file_path = os.path.abspath( os.path.join( 'scripts', 'migrate_tools', '%04d_tools.xml' % latest_tool_migration_script_number ) )
# Parse the XML and load the file attributes for later checking against the proprietary tool_panel_config.
migrated_tool_configs_dict = odict()
- tree = util.parse_xml( tools_xml_file_path )
+ tree = xml_util.parse_xml( tools_xml_file_path )
root = tree.getroot()
tool_shed = root.get( 'name' )
tool_shed_url = get_tool_shed_url_from_tools_xml_file_path( app, tool_shed )
@@ -48,7 +48,7 @@
# Parse the proprietary tool_panel_configs (the default is tool_conf.xml) and generate the list of missing tool config file names.
missing_tool_configs_dict = odict()
for tool_panel_config in tool_panel_configs:
- tree = util.parse_xml( tool_panel_config )
+ tree = xml_util.parse_xml( tool_panel_config )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
@@ -78,7 +78,7 @@
for config_filename in app.config.tool_configs:
# Any config file that includes a tool_path attribute in the root tag set like the following is shed-related.
# <toolbox tool_path="../shed_tools">
- tree = util.parse_xml( config_filename )
+ tree = xml_util.parse_xml( config_filename )
root = tree.getroot()
tool_path = root.get( 'tool_path', None )
if tool_path is None:
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/data_manager_util.py
--- a/lib/tool_shed/util/data_manager_util.py
+++ b/lib/tool_shed/util/data_manager_util.py
@@ -1,6 +1,6 @@
import logging
import os
-from galaxy import util
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
log = logging.getLogger( __name__ )
@@ -10,7 +10,7 @@
fh = open( config_filename, 'wb' )
fh.write( '<?xml version="1.0"?>\n<data_managers>\n' )#% ( shed_tool_conf_filename ))
for elem in config_elems:
- fh.write( util.xml_to_string( elem, pretty=True ) )
+ fh.write( xml_util.xml_to_string( elem ) )
fh.write( '</data_managers>\n' )
fh.close()
@@ -21,7 +21,7 @@
for tool_tup in repository_tools_tups:
repository_tools_by_guid[ tool_tup[ 1 ] ] = dict( tool_config_filename=tool_tup[ 0 ], tool=tool_tup[ 2 ] )
# Load existing data managers.
- config_elems = [ elem for elem in util.parse_xml( shed_data_manager_conf_filename ).getroot() ]
+ config_elems = [ elem for elem in xml_util.parse_xml( shed_data_manager_conf_filename ).getroot() ]
repo_data_manager_conf_filename = metadata_dict['data_manager'].get( 'config_filename', None )
if repo_data_manager_conf_filename is None:
log.debug( "No data_manager_conf.xml file has been defined." )
@@ -29,13 +29,13 @@
data_manager_config_has_changes = False
relative_repo_data_manager_dir = os.path.join( shed_config_dict.get( 'tool_path', '' ), relative_install_dir )
repo_data_manager_conf_filename = os.path.join( relative_repo_data_manager_dir, repo_data_manager_conf_filename )
- tree = util.parse_xml( repo_data_manager_conf_filename )
+ tree = xml_util.parse_xml( repo_data_manager_conf_filename )
root = tree.getroot()
for elem in root:
if elem.tag == 'data_manager':
data_manager_id = elem.get( 'id', None )
if data_manager_id is None:
- log.error( "A data manager was defined that does not have an id and will not be installed:\n%s" % ( util.xml_to_string( elem ) ) )
+ log.error( "A data manager was defined that does not have an id and will not be installed:\n%s" % ( xml_util.xml_to_string( elem ) ) )
continue
data_manager_dict = metadata_dict['data_manager'].get( 'data_managers', {} ).get( data_manager_id, None )
if data_manager_dict is None:
@@ -77,7 +77,7 @@
if data_manager:
rval.append( data_manager )
else:
- log.warning( "Encountered unexpected element '%s':\n%s" % ( elem.tag, util.xml_to_string( elem ) ) )
+ log.warning( "Encountered unexpected element '%s':\n%s" % ( elem.tag, xml_util.xml_to_string( elem ) ) )
config_elems.append( elem )
data_manager_config_has_changes = True
# Persist the altered shed_data_manager_config file.
@@ -89,7 +89,7 @@
metadata_dict = repository.metadata
if metadata_dict and 'data_manager' in metadata_dict:
shed_data_manager_conf_filename = app.config.shed_data_manager_config_file
- tree = util.parse_xml( shed_data_manager_conf_filename )
+ tree = xml_util.parse_xml( shed_data_manager_conf_filename )
root = tree.getroot()
assert root.tag == 'data_managers', 'The file provided (%s) for removing data managers from is not a valid data manager xml file.' % ( shed_data_manager_conf_filename )
guids = [ data_manager_dict.get( 'guid' ) for data_manager_dict in metadata_dict.get( 'data_manager', {} ).get( 'data_managers', {} ).itervalues() if 'guid' in data_manager_dict ]
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/datatype_util.py
--- a/lib/tool_shed/util/datatype_util.py
+++ b/lib/tool_shed/util/datatype_util.py
@@ -2,7 +2,7 @@
import os
import tempfile
from galaxy import eggs
-from galaxy import util
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
import pkg_resources
@@ -24,7 +24,7 @@
has been initialized, the registry's contents cannot be overridden by conflicting data types.
"""
try:
- tree = util.parse_xml( datatypes_config )
+ tree = xml_util.parse_xml( datatypes_config )
except Exception, e:
log.debug( "Error parsing %s, exception: %s" % ( datatypes_config, str( e ) ) )
return None, None
@@ -83,9 +83,9 @@
fd, proprietary_datatypes_config = tempfile.mkstemp()
os.write( fd, '<?xml version="1.0"?>\n' )
os.write( fd, '<datatypes>\n' )
- os.write( fd, '%s' % util.xml_to_string( registration ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( registration ) )
if sniffers:
- os.write( fd, '%s' % util.xml_to_string( sniffers ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( sniffers ) )
os.write( fd, '</datatypes>\n' )
os.close( fd )
os.chmod( proprietary_datatypes_config, 0644 )
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/metadata_util.py
--- a/lib/tool_shed/util/metadata_util.py
+++ b/lib/tool_shed/util/metadata_util.py
@@ -17,6 +17,7 @@
from tool_shed.util import readme_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
import pkg_resources
@@ -363,7 +364,7 @@
'error_messages': [] }
metadata_dict[ 'data_manager' ] = data_manager_metadata
try:
- tree = util.parse_xml( data_manager_config_filename )
+ tree = xml_util.parse_xml( data_manager_config_filename )
except Exception, e:
# We are not able to load any data managers.
error_message = 'There was an error parsing your Data Manager config file "%s": %s' % ( data_manager_config_filename, e )
@@ -638,7 +639,7 @@
or checkers.check_bz2( full_path )[ 0 ] or checkers.check_zip( full_path ) ):
try:
# Make sure we're looking at a tool config and not a display application config or something else.
- element_tree = util.parse_xml( full_path )
+ element_tree = xml_util.parse_xml( full_path )
element_tree_root = element_tree.getroot()
is_tool = element_tree_root.tag == 'tool'
except Exception, e:
@@ -751,7 +752,7 @@
error_message = ''
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = util.parse_xml( repository_dependencies_config )
+ tree = xml_util.parse_xml( repository_dependencies_config )
root = tree.getroot()
xml_is_valid = root.tag == 'repositories'
except Exception, e:
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/shed_util_common.py
--- a/lib/tool_shed/util/shed_util_common.py
+++ b/lib/tool_shed/util/shed_util_common.py
@@ -14,7 +14,7 @@
from galaxy.model.orm import and_
import sqlalchemy.orm.exc
from tool_shed.util import common_util
-from xml.etree import ElementTree as XmlET
+from tool_shed.util import xml_util
from galaxy import eggs
import pkg_resources
@@ -37,17 +37,6 @@
MAX_DISPLAY_SIZE = 32768
VALID_CHARS = set( string.letters + string.digits + "'\"-=_.()/+*^,:?!#[]%\\$@;{}&<>" )
-
-class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
- def __init__ ( self, html=0, target=None ):
- XmlET.XMLTreeBuilder.__init__( self, html, target )
- self._parser.CommentHandler = self.handle_comment
-
- def handle_comment ( self, data ):
- self._target.start( XmlET.Comment, {} )
- self._target.data( data )
- self._target.end( XmlET.Comment )
-
new_repo_email_alert_template = """
Sharable link: ${sharable_link}
Repository name: ${repository_name}
@@ -188,7 +177,7 @@
os.write( fd, '<?xml version="1.0"?>\n' )
os.write( fd, '<toolbox tool_path="%s">\n' % str( tool_path ) )
for elem in config_elems:
- os.write( fd, '%s' % util.xml_to_string( elem, pretty=True ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( elem ) )
os.write( fd, '</toolbox>\n' )
os.close( fd )
shutil.move( filename, os.path.abspath( config_filename ) )
@@ -346,7 +335,7 @@
file_name = strip_path( tool_config )
guids_and_configs[ guid ] = file_name
# Parse the shed_tool_conf file in which all of this repository's tools are defined and generate the tool_panel_dict.
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
@@ -1084,13 +1073,6 @@
prior_installation_required = util.asbool( str( prior_installation_required ) )
return tool_shed, name, owner, changeset_revision, prior_installation_required
-def parse_xml( fname ):
- """Returns a parsed xml tree with comments in tact."""
- fobj = open( fname, 'r' )
- tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
- fobj.close()
- return tree
-
def pretty_print( dict=None ):
if dict:
return json.to_json_string( dict, sort_keys=True, indent=4 * ' ' )
@@ -1271,7 +1253,7 @@
for tool_config_filename, guid, tool in repository_tools_tups:
guid_to_tool_elem_dict[ guid ] = generate_tool_elem( tool_shed, repository.name, repository.changeset_revision, repository.owner or '', tool_config_filename, tool, None )
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'section':
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/tool_util.py
--- a/lib/tool_shed/util/tool_util.py
+++ b/lib/tool_shed/util/tool_util.py
@@ -12,6 +12,7 @@
from galaxy.tools.parameters import dynamic_options
from galaxy.tools.search import ToolBoxSearch
from galaxy.web.form_builder import SelectField
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
import pkg_resources
@@ -36,7 +37,7 @@
shed_tool_conf = shed_tool_conf_dict[ 'config_filename' ]
tool_path = shed_tool_conf_dict[ 'tool_path' ]
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
config_elems.append( elem )
@@ -803,7 +804,7 @@
message = ''
tmp_tool_config = suc.get_named_tmpfile_from_ctx( ctx, ctx_file, work_dir )
if tmp_tool_config:
- element_tree = util.parse_xml( tmp_tool_config )
+ element_tree = xml_util.parse_xml( tmp_tool_config )
element_tree_root = element_tree.getroot()
# Look for code files required by the tool config.
tmp_code_files = []
@@ -845,7 +846,7 @@
shed_tool_conf = shed_tool_conf_dict[ 'config_filename' ]
tool_path = shed_tool_conf_dict[ 'tool_path' ]
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
config_elems.append( elem )
diff -r 3370974992f38c9f1de1527982adb362ba644fd9 -r ad8d4f2080c062f7c01bfc6f2f86dd0f5431f106 lib/tool_shed/util/xml_util.py
--- /dev/null
+++ b/lib/tool_shed/util/xml_util.py
@@ -0,0 +1,43 @@
+import logging
+import os
+import tempfile
+from xml.etree import ElementTree as XmlET
+import xml.etree.ElementTree
+
+log = logging.getLogger( __name__ )
+
+
+class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
+ def __init__ ( self, html=0, target=None ):
+ XmlET.XMLTreeBuilder.__init__( self, html, target )
+ self._parser.CommentHandler = self.handle_comment
+
+ def handle_comment ( self, data ):
+ self._target.start( XmlET.Comment, {} )
+ self._target.data( data )
+ self._target.end( XmlET.Comment )
+
+def create_and_write_tmp_file( elem ):
+ tmp_str = xml_to_string( elem )
+ fh = tempfile.NamedTemporaryFile( 'wb' )
+ tmp_filename = fh.name
+ fh.close()
+ fh = open( tmp_filename, 'wb' )
+ fh.write( '<?xml version="1.0"?>\n' )
+ fh.write( tmp_str )
+ fh.close()
+ return tmp_filename
+
+def parse_xml( file_name ):
+ """Returns a parsed xml tree with comments intact."""
+ try:
+ fobj = open( file_name, 'r' )
+ tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
+ fobj.close()
+ except Exception, e:
+ log.exception( "Exception attempting to parse %s: %s" % ( str( file_name ), str( e ) ) )
+ return None
+ return tree
+
+def xml_to_string( elem, encoding='utf-8' ):
+ return '%s\n' % xml.etree.ElementTree.tostring( elem, encoding=encoding )
https://bitbucket.org/galaxy/galaxy-central/commits/7ee9bfda98e5/
Changeset: 7ee9bfda98e5
User: greg
Date: 2013-05-23 20:58:00
Summary: Merged from next-stable
Affected #: 12 files
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -14,6 +14,7 @@
from tool_shed.util import repository_dependency_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -131,7 +132,7 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -140,7 +141,7 @@
# are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -268,13 +269,13 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
if ok:
repo_path = os.path.join( full_path, relative_path )
@@ -330,13 +331,13 @@
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
+ tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
return commit_util.handle_directory_changes( trans,
repository,
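The pattern repeated throughout this controller — serialize the altered root element to a temp file, then `shutil.move` it over the original — can be sketched as follows. This is a hypothetical Python 3 stand-in for `xml_util.create_and_write_tmp_file`, not the committed code; it uses `tempfile.mkstemp` rather than the close-then-reopen of a `NamedTemporaryFile`:

```python
import os
import shutil
import tempfile
import xml.etree.ElementTree as ET

def create_and_write_tmp_file(elem):
    # Serialize the (possibly altered) root element to a named temp file
    # and return the path so the caller can move it into place.
    fd, tmp_filename = tempfile.mkstemp(suffix=".xml")
    with os.fdopen(fd, "w") as fh:
        fh.write('<?xml version="1.0"?>\n')
        fh.write(ET.tostring(elem, encoding="unicode") + "\n")
    return tmp_filename

# Usage, mirroring the upload controller: rewrite the tree, then move it
# over the uploaded file atomically (within a filesystem).
root_elem = ET.Element("repositories")
ET.SubElement(root_elem, "repository", name="emboss", owner="devteam")
tmp_filename = create_and_write_tmp_file(root_elem)
fd, target = tempfile.mkstemp(suffix=".xml")
os.close(fd)
shutil.move(tmp_filename, target)
```

Writing to a temp file first means a failed serialization never leaves a half-written `repository_dependencies.xml` in the repository.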
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/galaxy_install/install_manager.py
--- a/lib/tool_shed/galaxy_install/install_manager.py
+++ b/lib/tool_shed/galaxy_install/install_manager.py
@@ -15,6 +15,7 @@
from tool_shed.util import metadata_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
from galaxy.util.odict import odict
@@ -37,13 +38,13 @@
self.proprietary_tool_confs = self.non_shed_tool_panel_configs
self.proprietary_tool_panel_elems = self.get_proprietary_tool_panel_elems( latest_migration_script_number )
# Set the location where the repositories will be installed by retrieving the tool_path setting from migrated_tools_config.
- tree = util.parse_xml( migrated_tools_config )
+ tree = xml_util.parse_xml( migrated_tools_config )
root = tree.getroot()
self.tool_path = root.get( 'tool_path' )
print "Repositories will be installed into configured tool_path location ", str( self.tool_path )
# Parse tool_shed_install_config to check each of the tools.
self.tool_shed_install_config = tool_shed_install_config
- tree = util.parse_xml( tool_shed_install_config )
+ tree = xml_util.parse_xml( tool_shed_install_config )
root = tree.getroot()
self.tool_shed = suc.clean_tool_shed_url( root.get( 'name' ) )
self.repository_owner = common_util.REPOSITORY_OWNER
@@ -107,7 +108,7 @@
tools_xml_file_path = os.path.abspath( os.path.join( 'scripts', 'migrate_tools', '%04d_tools.xml' % latest_tool_migration_script_number ) )
# Parse the XML and load the file attributes for later checking against the integrated elements from self.proprietary_tool_confs.
migrated_tool_configs = []
- tree = util.parse_xml( tools_xml_file_path )
+ tree = xml_util.parse_xml( tools_xml_file_path )
root = tree.getroot()
for elem in root:
if elem.tag == 'repository':
@@ -116,7 +117,7 @@
# Parse each file in self.proprietary_tool_confs and generate the integrated list of tool panel Elements that contain them.
tool_panel_elems = []
for proprietary_tool_conf in self.proprietary_tool_confs:
- tree = util.parse_xml( proprietary_tool_conf )
+ tree = xml_util.parse_xml( proprietary_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -10,6 +10,7 @@
import tool_shed.util.common_util as cu
from tool_shed.util import encoding_util
from tool_shed.util import tool_dependency_util
+from tool_shed.util import xml_util
from galaxy.model.orm import and_
from galaxy.web import url_for
@@ -478,7 +479,7 @@
tool_dependency = None
action_dict = {}
if tool_dependencies_config:
- required_td_tree = parse_xml( tool_dependencies_config )
+ required_td_tree = xml_util.parse_xml( tool_dependencies_config )
if required_td_tree:
required_td_root = required_td_tree.getroot()
for required_td_elem in required_td_root:
@@ -623,17 +624,6 @@
file_name = fpath
return file_name
-def parse_xml( file_name ):
- """Returns a parsed xml tree."""
- try:
- tree = ElementTree.parse( file_name )
- except Exception, e:
- print "Exception attempting to parse ", file_name, ": ", str( e )
- return None
- root = tree.getroot()
- ElementInclude.include( root )
- return tree
-
def url_join( *args ):
parts = []
for arg in args:
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/tool_shed_registry.py
--- a/lib/tool_shed/tool_shed_registry.py
+++ b/lib/tool_shed/tool_shed_registry.py
@@ -1,9 +1,8 @@
import logging
import sys
import urllib2
-from galaxy.util import parse_xml
from galaxy.util.odict import odict
-from xml.etree import ElementTree
+from tool_shed.util import xml_util
log = logging.getLogger( __name__ )
@@ -15,7 +14,7 @@
self.tool_sheds_auth = odict()
if root_dir and config:
# Parse tool_sheds_conf.xml
- tree = parse_xml( config )
+ tree = xml_util.parse_xml( config )
root = tree.getroot()
log.debug( 'Loading references to tool sheds from %s' % config )
for elem in root.findall( 'tool_shed' ):
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -9,7 +9,7 @@
from galaxy.web import url_for
import tool_shed.util.shed_util_common as suc
from tool_shed.util import tool_util
-import xml.etree.ElementTree
+from tool_shed.util import xml_util
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -60,17 +60,6 @@
message = 'The file "%s" contains image content.\n' % str( file_path )
return message
-def create_and_write_tmp_file( root ):
- tmp_str = '%s\n' % xml.etree.ElementTree.tostring( root, encoding='utf-8' )
- fh = tempfile.NamedTemporaryFile( 'wb' )
- tmp_filename = fh.name
- fh.close()
- fh = open( tmp_filename, 'wb' )
- fh.write( '<?xml version="1.0"?>\n' )
- fh.write( tmp_str )
- fh.close()
- return tmp_filename
-
def get_upload_point( repository, **kwd ):
upload_point = kwd.get( 'upload_point', None )
if upload_point is not None:
@@ -198,7 +187,7 @@
altered = False
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = suc.parse_xml( repository_dependencies_config )
+ tree = xml_util.parse_xml( repository_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_repository_dependencies_definition: " % str( repository_dependencies_config )
@@ -245,7 +234,7 @@
altered = False
try:
# Make sure we're looking at a valid tool_dependencies.xml file.
- tree = suc.parse_xml( tool_dependencies_config )
+ tree = xml_util.parse_xml( tool_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_tool_dependencies_definition: " % str( tool_dependencies_config )
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/common_util.py
--- a/lib/tool_shed/util/common_util.py
+++ b/lib/tool_shed/util/common_util.py
@@ -1,8 +1,8 @@
import os
import urllib2
-from galaxy import util
from galaxy.util.odict import odict
from tool_shed.util import encoding_util
+from tool_shed.util import xml_util
REPOSITORY_OWNER = 'devteam'
@@ -11,7 +11,7 @@
tools_xml_file_path = os.path.abspath( os.path.join( 'scripts', 'migrate_tools', '%04d_tools.xml' % latest_tool_migration_script_number ) )
# Parse the XML and load the file attributes for later checking against the proprietary tool_panel_config.
migrated_tool_configs_dict = odict()
- tree = util.parse_xml( tools_xml_file_path )
+ tree = xml_util.parse_xml( tools_xml_file_path )
root = tree.getroot()
tool_shed = root.get( 'name' )
tool_shed_url = get_tool_shed_url_from_tools_xml_file_path( app, tool_shed )
@@ -48,7 +48,7 @@
# Parse the proprietary tool_panel_configs (the default is tool_conf.xml) and generate the list of missing tool config file names.
missing_tool_configs_dict = odict()
for tool_panel_config in tool_panel_configs:
- tree = util.parse_xml( tool_panel_config )
+ tree = xml_util.parse_xml( tool_panel_config )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
@@ -78,7 +78,7 @@
for config_filename in app.config.tool_configs:
# Any config file that includes a tool_path attribute in the root tag set like the following is shed-related.
# <toolbox tool_path="../shed_tools">
- tree = util.parse_xml( config_filename )
+ tree = xml_util.parse_xml( config_filename )
root = tree.getroot()
tool_path = root.get( 'tool_path', None )
if tool_path is None:
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/data_manager_util.py
--- a/lib/tool_shed/util/data_manager_util.py
+++ b/lib/tool_shed/util/data_manager_util.py
@@ -1,6 +1,6 @@
import logging
import os
-from galaxy import util
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
log = logging.getLogger( __name__ )
@@ -10,7 +10,7 @@
fh = open( config_filename, 'wb' )
fh.write( '<?xml version="1.0"?>\n<data_managers>\n' )#% ( shed_tool_conf_filename ))
for elem in config_elems:
- fh.write( util.xml_to_string( elem, pretty=True ) )
+ fh.write( xml_util.xml_to_string( elem ) )
fh.write( '</data_managers>\n' )
fh.close()
@@ -21,7 +21,7 @@
for tool_tup in repository_tools_tups:
repository_tools_by_guid[ tool_tup[ 1 ] ] = dict( tool_config_filename=tool_tup[ 0 ], tool=tool_tup[ 2 ] )
# Load existing data managers.
- config_elems = [ elem for elem in util.parse_xml( shed_data_manager_conf_filename ).getroot() ]
+ config_elems = [ elem for elem in xml_util.parse_xml( shed_data_manager_conf_filename ).getroot() ]
repo_data_manager_conf_filename = metadata_dict['data_manager'].get( 'config_filename', None )
if repo_data_manager_conf_filename is None:
log.debug( "No data_manager_conf.xml file has been defined." )
@@ -29,13 +29,13 @@
data_manager_config_has_changes = False
relative_repo_data_manager_dir = os.path.join( shed_config_dict.get( 'tool_path', '' ), relative_install_dir )
repo_data_manager_conf_filename = os.path.join( relative_repo_data_manager_dir, repo_data_manager_conf_filename )
- tree = util.parse_xml( repo_data_manager_conf_filename )
+ tree = xml_util.parse_xml( repo_data_manager_conf_filename )
root = tree.getroot()
for elem in root:
if elem.tag == 'data_manager':
data_manager_id = elem.get( 'id', None )
if data_manager_id is None:
- log.error( "A data manager was defined that does not have an id and will not be installed:\n%s" % ( util.xml_to_string( elem ) ) )
+ log.error( "A data manager was defined that does not have an id and will not be installed:\n%s" % ( xml_util.xml_to_string( elem ) ) )
continue
data_manager_dict = metadata_dict['data_manager'].get( 'data_managers', {} ).get( data_manager_id, None )
if data_manager_dict is None:
@@ -77,7 +77,7 @@
if data_manager:
rval.append( data_manager )
else:
- log.warning( "Encountered unexpected element '%s':\n%s" % ( elem.tag, util.xml_to_string( elem ) ) )
+ log.warning( "Encountered unexpected element '%s':\n%s" % ( elem.tag, xml_util.xml_to_string( elem ) ) )
config_elems.append( elem )
data_manager_config_has_changes = True
# Persist the altered shed_data_manager_config file.
@@ -89,7 +89,7 @@
metadata_dict = repository.metadata
if metadata_dict and 'data_manager' in metadata_dict:
shed_data_manager_conf_filename = app.config.shed_data_manager_config_file
- tree = util.parse_xml( shed_data_manager_conf_filename )
+ tree = xml_util.parse_xml( shed_data_manager_conf_filename )
root = tree.getroot()
assert root.tag == 'data_managers', 'The file provided (%s) for removing data managers from is not a valid data manager xml file.' % ( shed_data_manager_conf_filename )
guids = [ data_manager_dict.get( 'guid' ) for data_manager_dict in metadata_dict.get( 'data_manager', {} ).get( 'data_managers', {} ).itervalues() if 'guid' in data_manager_dict ]
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/datatype_util.py
--- a/lib/tool_shed/util/datatype_util.py
+++ b/lib/tool_shed/util/datatype_util.py
@@ -2,7 +2,7 @@
import os
import tempfile
from galaxy import eggs
-from galaxy import util
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
import pkg_resources
@@ -24,7 +24,7 @@
has been initialized, the registry's contents cannot be overridden by conflicting data types.
"""
try:
- tree = util.parse_xml( datatypes_config )
+ tree = xml_util.parse_xml( datatypes_config )
except Exception, e:
log.debug( "Error parsing %s, exception: %s" % ( datatypes_config, str( e ) ) )
return None, None
@@ -83,9 +83,9 @@
fd, proprietary_datatypes_config = tempfile.mkstemp()
os.write( fd, '<?xml version="1.0"?>\n' )
os.write( fd, '<datatypes>\n' )
- os.write( fd, '%s' % util.xml_to_string( registration ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( registration ) )
if sniffers:
- os.write( fd, '%s' % util.xml_to_string( sniffers ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( sniffers ) )
os.write( fd, '</datatypes>\n' )
os.close( fd )
os.chmod( proprietary_datatypes_config, 0644 )
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/metadata_util.py
--- a/lib/tool_shed/util/metadata_util.py
+++ b/lib/tool_shed/util/metadata_util.py
@@ -17,6 +17,7 @@
from tool_shed.util import readme_util
from tool_shed.util import tool_dependency_util
from tool_shed.util import tool_util
+from tool_shed.util import xml_util
import pkg_resources
@@ -363,7 +364,7 @@
'error_messages': [] }
metadata_dict[ 'data_manager' ] = data_manager_metadata
try:
- tree = util.parse_xml( data_manager_config_filename )
+ tree = xml_util.parse_xml( data_manager_config_filename )
except Exception, e:
# We are not able to load any data managers.
error_message = 'There was an error parsing your Data Manager config file "%s": %s' % ( data_manager_config_filename, e )
@@ -638,7 +639,7 @@
or checkers.check_bz2( full_path )[ 0 ] or checkers.check_zip( full_path ) ):
try:
# Make sure we're looking at a tool config and not a display application config or something else.
- element_tree = util.parse_xml( full_path )
+ element_tree = xml_util.parse_xml( full_path )
element_tree_root = element_tree.getroot()
is_tool = element_tree_root.tag == 'tool'
except Exception, e:
@@ -751,7 +752,7 @@
error_message = ''
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = util.parse_xml( repository_dependencies_config )
+ tree = xml_util.parse_xml( repository_dependencies_config )
root = tree.getroot()
xml_is_valid = root.tag == 'repositories'
except Exception, e:
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/shed_util_common.py
--- a/lib/tool_shed/util/shed_util_common.py
+++ b/lib/tool_shed/util/shed_util_common.py
@@ -14,7 +14,7 @@
from galaxy.model.orm import and_
import sqlalchemy.orm.exc
from tool_shed.util import common_util
-from xml.etree import ElementTree as XmlET
+from tool_shed.util import xml_util
from galaxy import eggs
import pkg_resources
@@ -37,17 +37,6 @@
MAX_DISPLAY_SIZE = 32768
VALID_CHARS = set( string.letters + string.digits + "'\"-=_.()/+*^,:?!#[]%\\$@;{}&<>" )
-
-class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
- def __init__ ( self, html=0, target=None ):
- XmlET.XMLTreeBuilder.__init__( self, html, target )
- self._parser.CommentHandler = self.handle_comment
-
- def handle_comment ( self, data ):
- self._target.start( XmlET.Comment, {} )
- self._target.data( data )
- self._target.end( XmlET.Comment )
-
new_repo_email_alert_template = """
Sharable link: ${sharable_link}
Repository name: ${repository_name}
@@ -188,7 +177,7 @@
os.write( fd, '<?xml version="1.0"?>\n' )
os.write( fd, '<toolbox tool_path="%s">\n' % str( tool_path ) )
for elem in config_elems:
- os.write( fd, '%s' % util.xml_to_string( elem, pretty=True ) )
+ os.write( fd, '%s' % xml_util.xml_to_string( elem ) )
os.write( fd, '</toolbox>\n' )
os.close( fd )
shutil.move( filename, os.path.abspath( config_filename ) )
@@ -346,7 +335,7 @@
file_name = strip_path( tool_config )
guids_and_configs[ guid ] = file_name
# Parse the shed_tool_conf file in which all of this repository's tools are defined and generate the tool_panel_dict.
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'tool':
@@ -1084,13 +1073,6 @@
prior_installation_required = util.asbool( str( prior_installation_required ) )
return tool_shed, name, owner, changeset_revision, prior_installation_required
-def parse_xml( fname ):
- """Returns a parsed xml tree with comments in tact."""
- fobj = open( fname, 'r' )
- tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
- fobj.close()
- return tree
-
def pretty_print( dict=None ):
if dict:
return json.to_json_string( dict, sort_keys=True, indent=4 * ' ' )
@@ -1271,7 +1253,7 @@
for tool_config_filename, guid, tool in repository_tools_tups:
guid_to_tool_elem_dict[ guid ] = generate_tool_elem( tool_shed, repository.name, repository.changeset_revision, repository.owner or '', tool_config_filename, tool, None )
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
if elem.tag == 'section':
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/tool_util.py
--- a/lib/tool_shed/util/tool_util.py
+++ b/lib/tool_shed/util/tool_util.py
@@ -12,6 +12,7 @@
from galaxy.tools.parameters import dynamic_options
from galaxy.tools.search import ToolBoxSearch
from galaxy.web.form_builder import SelectField
+from tool_shed.util import xml_util
import tool_shed.util.shed_util_common as suc
import pkg_resources
@@ -36,7 +37,7 @@
shed_tool_conf = shed_tool_conf_dict[ 'config_filename' ]
tool_path = shed_tool_conf_dict[ 'tool_path' ]
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
config_elems.append( elem )
@@ -803,7 +804,7 @@
message = ''
tmp_tool_config = suc.get_named_tmpfile_from_ctx( ctx, ctx_file, work_dir )
if tmp_tool_config:
- element_tree = util.parse_xml( tmp_tool_config )
+ element_tree = xml_util.parse_xml( tmp_tool_config )
element_tree_root = element_tree.getroot()
# Look for code files required by the tool config.
tmp_code_files = []
@@ -845,7 +846,7 @@
shed_tool_conf = shed_tool_conf_dict[ 'config_filename' ]
tool_path = shed_tool_conf_dict[ 'tool_path' ]
config_elems = []
- tree = util.parse_xml( shed_tool_conf )
+ tree = xml_util.parse_xml( shed_tool_conf )
root = tree.getroot()
for elem in root:
config_elems.append( elem )
diff -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c -r 7ee9bfda98e5bb8b0ee348309037ad2ba309cef6 lib/tool_shed/util/xml_util.py
--- /dev/null
+++ b/lib/tool_shed/util/xml_util.py
@@ -0,0 +1,43 @@
+import logging
+import os
+import tempfile
+from xml.etree import ElementTree as XmlET
+import xml.etree.ElementTree
+
+log = logging.getLogger( __name__ )
+
+
+class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
+ def __init__ ( self, html=0, target=None ):
+ XmlET.XMLTreeBuilder.__init__( self, html, target )
+ self._parser.CommentHandler = self.handle_comment
+
+ def handle_comment ( self, data ):
+ self._target.start( XmlET.Comment, {} )
+ self._target.data( data )
+ self._target.end( XmlET.Comment )
+
+def create_and_write_tmp_file( elem ):
+ tmp_str = xml_to_string( elem )
+ fh = tempfile.NamedTemporaryFile( 'wb' )
+ tmp_filename = fh.name
+ fh.close()
+ fh = open( tmp_filename, 'wb' )
+ fh.write( '<?xml version="1.0"?>\n' )
+ fh.write( tmp_str )
+ fh.close()
+ return tmp_filename
+
+def parse_xml( file_name ):
+ """Returns a parsed xml tree with comments intact."""
+ try:
+ fobj = open( file_name, 'r' )
+ tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
+ fobj.close()
+ except Exception, e:
+ log.exception( "Exception attempting to parse %s: %s" % ( str( file_name ), str( e ) ) )
+ return None
+ return tree
+
+def xml_to_string( elem, encoding='utf-8' ):
+ return '%s\n' % xml.etree.ElementTree.tostring( elem, encoding=encoding )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/3370974992f3/
Changeset: 3370974992f3
Branch: next-stable
User: carlfeberhard
Date: 2013-05-23 19:47:03
Summary: Controllers.dataset: add exception string to log msg in _delete
Affected #: 1 file
diff -r 0acfbcd4235b989f81c580c434471f1e657f298a -r 3370974992f38c9f1de1527982adb362ba644fd9 lib/galaxy/webapps/galaxy/controllers/dataset.py
--- a/lib/galaxy/webapps/galaxy/controllers/dataset.py
+++ b/lib/galaxy/webapps/galaxy/controllers/dataset.py
@@ -826,7 +826,7 @@
trans.sa_session.flush()
except Exception, e:
msg = 'HDA deletion failed (encoded: %s, decoded: %s)' % ( dataset_id, id )
- log.exception( msg )
+ log.exception( msg + ': ' + str( e ) )
trans.log_event( msg )
message = 'Dataset deletion failed'
status = 'error'
https://bitbucket.org/galaxy/galaxy-central/commits/ff4cf806bf1e/
Changeset: ff4cf806bf1e
User: carlfeberhard
Date: 2013-05-23 19:50:10
Summary: merge next-stable
Affected #: 1 file
diff -r 1e006b5275f5d3fce99f98b3e6cb00454d5e3c3d -r ff4cf806bf1e6e3bfcd29a539b842f3398a1455c lib/galaxy/webapps/galaxy/controllers/dataset.py
--- a/lib/galaxy/webapps/galaxy/controllers/dataset.py
+++ b/lib/galaxy/webapps/galaxy/controllers/dataset.py
@@ -826,7 +826,7 @@
trans.sa_session.flush()
except Exception, e:
msg = 'HDA deletion failed (encoded: %s, decoded: %s)' % ( dataset_id, id )
- log.exception( msg )
+ log.exception( msg + ': ' + str( e ) )
trans.log_event( msg )
message = 'Dataset deletion failed'
status = 'error'
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/0acfbcd4235b/
Changeset: 0acfbcd4235b
Branch: next-stable
User: greg
Date: 2013-05-23 17:49:52
Summary: Keep all contents of dependency definition files when re-writing them to include missing toolshed and changeset_revision attributes.
Affected #: 3 files
diff -r 966fbc93c483970092b677f17886e86060941093 -r 0acfbcd4235b989f81c580c434471f1e657f298a lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -129,18 +129,18 @@
# Move some version of the uploaded file to the load_point within the repository hierarchy.
if uploaded_file_filename in [ 'repository_dependencies.xml' ]:
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
elif uploaded_file_filename in [ 'tool_dependencies.xml' ]:
# Inspect the contents of the file to see if it defines a complex repository dependency definition whose changeset_revision values
# are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -266,15 +266,15 @@
uploaded_file_name = os.path.abspath( os.path.join( root, uploaded_file ) )
if os.path.split( uploaded_file_name )[ -1 ] == 'repository_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
if ok:
repo_path = os.path.join( full_path, relative_path )
@@ -328,15 +328,15 @@
uploaded_file_name = os.path.join( full_path, filename )
if os.path.split( uploaded_file_name )[ -1 ] == 'repository_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
return commit_util.handle_directory_changes( trans,
repository,
diff -r 966fbc93c483970092b677f17886e86060941093 -r 0acfbcd4235b989f81c580c434471f1e657f298a lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -9,7 +9,7 @@
from galaxy.web import url_for
import tool_shed.util.shed_util_common as suc
from tool_shed.util import tool_util
-
+import xml.etree.ElementTree
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -17,9 +17,6 @@
from mercurial import hg
from mercurial import ui
-pkg_resources.require( 'elementtree' )
-from elementtree.ElementTree import tostring
-
log = logging.getLogger( __name__ )
UNDESIRABLE_DIRS = [ '.hg', '.svn', '.git', '.cvs' ]
@@ -64,12 +61,13 @@
return message
def create_and_write_tmp_file( root ):
+ tmp_str = '%s\n' % xml.etree.ElementTree.tostring( root, encoding='utf-8' )
fh = tempfile.NamedTemporaryFile( 'wb' )
tmp_filename = fh.name
fh.close()
fh = open( tmp_filename, 'wb' )
fh.write( '<?xml version="1.0"?>\n' )
- fh.write( tostring( root, 'utf-8' ) )
+ fh.write( tmp_str )
fh.close()
return tmp_filename
@@ -200,7 +198,7 @@
altered = False
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = util.parse_xml( repository_dependencies_config )
+ tree = suc.parse_xml( repository_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_repository_dependencies_definition: " % str( repository_dependencies_config )
@@ -247,7 +245,7 @@
altered = False
try:
# Make sure we're looking at a valid tool_dependencies.xml file.
- tree = util.parse_xml( tool_dependencies_config )
+ tree = suc.parse_xml( tool_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_tool_dependencies_definition: " % str( tool_dependencies_config )
diff -r 966fbc93c483970092b677f17886e86060941093 -r 0acfbcd4235b989f81c580c434471f1e657f298a lib/tool_shed/util/shed_util_common.py
--- a/lib/tool_shed/util/shed_util_common.py
+++ b/lib/tool_shed/util/shed_util_common.py
@@ -14,6 +14,7 @@
from galaxy.model.orm import and_
import sqlalchemy.orm.exc
from tool_shed.util import common_util
+from xml.etree import ElementTree as XmlET
from galaxy import eggs
import pkg_resources
@@ -36,6 +37,17 @@
MAX_DISPLAY_SIZE = 32768
VALID_CHARS = set( string.letters + string.digits + "'\"-=_.()/+*^,:?!#[]%\\$@;{}&<>" )
+
+class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
+ def __init__ ( self, html=0, target=None ):
+ XmlET.XMLTreeBuilder.__init__( self, html, target )
+ self._parser.CommentHandler = self.handle_comment
+
+ def handle_comment ( self, data ):
+ self._target.start( XmlET.Comment, {} )
+ self._target.data( data )
+ self._target.end( XmlET.Comment )
+
new_repo_email_alert_template = """
Sharable link: ${sharable_link}
Repository name: ${repository_name}
@@ -1072,6 +1084,13 @@
prior_installation_required = util.asbool( str( prior_installation_required ) )
return tool_shed, name, owner, changeset_revision, prior_installation_required
+def parse_xml( fname ):
+ """Returns a parsed xml tree with comments in tact."""
+ fobj = open( fname, 'r' )
+ tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
+ fobj.close()
+ return tree
+
def pretty_print( dict=None ):
if dict:
return json.to_json_string( dict, sort_keys=True, indent=4 * ' ' )
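The `CommentedTreeBuilder` / `parse_xml` pair above exists because a stock ElementTree parse silently drops XML comments, so rewriting a dependency file would lose them. On modern Python (3.8+) the `XMLTreeBuilder` subclassing trick is no longer needed (and `XMLTreeBuilder` itself was removed in 3.9); `TreeBuilder(insert_comments=True)` achieves the same effect. A minimal sketch of that approach, not the Tool Shed's own code:

```python
import xml.etree.ElementTree as ET

def parse_xml_with_comments(text):
    """Parse XML while keeping <!-- comments --> in the resulting tree.

    Python 3.8+ equivalent of the CommentedTreeBuilder subclass above:
    insert_comments=True makes the TreeBuilder emit comment nodes whose
    tag is the ET.Comment factory function.
    """
    parser = ET.XMLParser(target=ET.TreeBuilder(insert_comments=True))
    return ET.fromstring(text, parser=parser)

root = parse_xml_with_comments(
    "<repository_dependencies><!-- pinned revision -->"
    "<repository name='bwa' owner='devteam'/></repository_dependencies>"
)
# Comment nodes are ordinary elements whose tag is ET.Comment.
comments = [el.text for el in root.iter() if el.tag is ET.Comment]
```

A comment-preserving round trip matters here because `create_and_write_tmp_file` serializes the tree back out: with a plain parse, every comment a repository owner wrote in `tool_dependencies.xml` would vanish on upload.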
https://bitbucket.org/galaxy/galaxy-central/commits/1e006b5275f5/
Changeset: 1e006b5275f5
User: greg
Date: 2013-05-23 17:50:16
Summary: Merged from next-stable
Affected #: 3 files
diff -r 1e601ee269763ebc51a42b757f5bd6d93b9f8fc4 -r 1e006b5275f5d3fce99f98b3e6cb00454d5e3c3d lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -129,18 +129,18 @@
# Move some version of the uploaded file to the load_point within the repository hierarchy.
if uploaded_file_filename in [ 'repository_dependencies.xml' ]:
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
elif uploaded_file_filename in [ 'tool_dependencies.xml' ]:
# Inspect the contents of the file to see if it defines a complex repository dependency definition whose changeset_revision values
# are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
shutil.move( uploaded_file_name, full_path )
@@ -266,15 +266,15 @@
uploaded_file_name = os.path.abspath( os.path.join( root, uploaded_file ) )
if os.path.split( uploaded_file_name )[ -1 ] == 'repository_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
if ok:
repo_path = os.path.join( full_path, relative_path )
@@ -328,15 +328,15 @@
uploaded_file_name = os.path.join( full_path, filename )
if os.path.split( uploaded_file_name )[ -1 ] == 'repository_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == 'tool_dependencies.xml':
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
+ altered, root_elem = commit_util.handle_tool_dependencies_definition( trans, uploaded_file_name )
if altered:
- tmp_filename = commit_util.create_and_write_tmp_file( root )
+ tmp_filename = commit_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
return commit_util.handle_directory_changes( trans,
repository,
diff -r 1e601ee269763ebc51a42b757f5bd6d93b9f8fc4 -r 1e006b5275f5d3fce99f98b3e6cb00454d5e3c3d lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -9,7 +9,7 @@
from galaxy.web import url_for
import tool_shed.util.shed_util_common as suc
from tool_shed.util import tool_util
-
+import xml.etree.ElementTree
from galaxy import eggs
eggs.require( 'mercurial' )
@@ -17,9 +17,6 @@
from mercurial import hg
from mercurial import ui
-pkg_resources.require( 'elementtree' )
-from elementtree.ElementTree import tostring
-
log = logging.getLogger( __name__ )
UNDESIRABLE_DIRS = [ '.hg', '.svn', '.git', '.cvs' ]
@@ -64,12 +61,13 @@
return message
def create_and_write_tmp_file( root ):
+ tmp_str = '%s\n' % xml.etree.ElementTree.tostring( root, encoding='utf-8' )
fh = tempfile.NamedTemporaryFile( 'wb' )
tmp_filename = fh.name
fh.close()
fh = open( tmp_filename, 'wb' )
fh.write( '<?xml version="1.0"?>\n' )
- fh.write( tostring( root, 'utf-8' ) )
+ fh.write( tmp_str )
fh.close()
return tmp_filename
@@ -200,7 +198,7 @@
altered = False
try:
# Make sure we're looking at a valid repository_dependencies.xml file.
- tree = util.parse_xml( repository_dependencies_config )
+ tree = suc.parse_xml( repository_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_repository_dependencies_definition: " % str( repository_dependencies_config )
@@ -247,7 +245,7 @@
altered = False
try:
# Make sure we're looking at a valid tool_dependencies.xml file.
- tree = util.parse_xml( tool_dependencies_config )
+ tree = suc.parse_xml( tool_dependencies_config )
root = tree.getroot()
except Exception, e:
error_message = "Error parsing %s in handle_tool_dependencies_definition: " % str( tool_dependencies_config )
diff -r 1e601ee269763ebc51a42b757f5bd6d93b9f8fc4 -r 1e006b5275f5d3fce99f98b3e6cb00454d5e3c3d lib/tool_shed/util/shed_util_common.py
--- a/lib/tool_shed/util/shed_util_common.py
+++ b/lib/tool_shed/util/shed_util_common.py
@@ -14,6 +14,7 @@
from galaxy.model.orm import and_
import sqlalchemy.orm.exc
from tool_shed.util import common_util
+from xml.etree import ElementTree as XmlET
from galaxy import eggs
import pkg_resources
@@ -36,6 +37,17 @@
MAX_DISPLAY_SIZE = 32768
VALID_CHARS = set( string.letters + string.digits + "'\"-=_.()/+*^,:?!#[]%\\$@;{}&<>" )
+
+class CommentedTreeBuilder ( XmlET.XMLTreeBuilder ):
+ def __init__ ( self, html=0, target=None ):
+ XmlET.XMLTreeBuilder.__init__( self, html, target )
+ self._parser.CommentHandler = self.handle_comment
+
+ def handle_comment ( self, data ):
+ self._target.start( XmlET.Comment, {} )
+ self._target.data( data )
+ self._target.end( XmlET.Comment )
+
new_repo_email_alert_template = """
Sharable link: ${sharable_link}
Repository name: ${repository_name}
@@ -1072,6 +1084,13 @@
prior_installation_required = util.asbool( str( prior_installation_required ) )
return tool_shed, name, owner, changeset_revision, prior_installation_required
+def parse_xml( fname ):
+ """Returns a parsed xml tree with comments in tact."""
+ fobj = open( fname, 'r' )
+ tree = XmlET.parse( fobj, parser=CommentedTreeBuilder() )
+ fobj.close()
+ return tree
+
def pretty_print( dict=None ):
if dict:
return json.to_json_string( dict, sort_keys=True, indent=4 * ' ' )
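The `create_and_write_tmp_file` + `shutil.move` sequence in `upload.py` is a write-then-move pattern: serialize the amended tree to a temporary file, then move it over the destination so readers never see a half-written file. A hedged modern sketch of the same pattern (the function name is illustrative, not from the codebase); note that on Python 3, `ET.tostring(root, encoding='utf-8')` already emits the XML declaration, so no separate declaration line is written:

```python
import os
import shutil
import tempfile
import xml.etree.ElementTree as ET

def write_tree_atomically(root, dest_path):
    """Serialize an element tree to a temp file, then move it into place.

    mkstemp keeps the temp file alive after close (unlike the default
    NamedTemporaryFile behavior), so shutil.move can relocate it safely.
    """
    fd, tmp_path = tempfile.mkstemp(suffix='.xml')
    try:
        with os.fdopen(fd, 'wb') as fh:
            # Includes the <?xml ...?> declaration on Python 3.
            fh.write(ET.tostring(root, encoding='utf-8'))
            fh.write(b'\n')
        shutil.move(tmp_path, dest_path)
    except Exception:
        os.unlink(tmp_path)
        raise
```

The original Python 2 helper instead closed a `NamedTemporaryFile` (which deletes it) and reopened the same name, a small race that `mkstemp` avoids.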
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: Dave Bouvier: Merged next-stable into default.
by commits-noreply@bitbucket.org 23 May '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/1e601ee26976/
Changeset: 1e601ee26976
User: Dave Bouvier
Date: 2013-05-23 17:48:03
Summary: Merged next-stable into default.
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: dan: Fix for TabularToolDataTable.get_column_name_list() when value column is overloaded by e.g. name.
by commits-noreply@bitbucket.org 23 May '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/966fbc93c483/
Changeset: 966fbc93c483
Branch: next-stable
User: dan
Date: 2013-05-23 17:11:37
Summary: Fix for TabularToolDataTable.get_column_name_list() when value column is overloaded by e.g. name.
Affected #: 1 file
diff -r 064ea784eae74cdb7b4ac0319ec0aa13ee55fcbd -r 966fbc93c483970092b677f17886e86060941093 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -297,9 +297,12 @@
found_column = False
for name, index in self.columns.iteritems():
if index == i:
- rval.append( name )
+ if not found_column:
+ rval.append( name )
+ elif name == 'value':
+ #the column named 'value' always has priority over other named columns
+ rval[ -1 ] = name
found_column = True
- break
if not found_column:
rval.append( None )
return rval
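The hunk above removes the early `break` so that when several column names map to the same index, the name `'value'` wins the tie instead of whichever name the dict happened to yield first. A hypothetical standalone version of the patched logic (`columns` maps names to indexes, as in `TabularToolDataTable`; the function signature here is illustrative):

```python
def get_column_name_list(columns, largest_index):
    """Resolve one display name per column index 0..largest_index.

    When several names share an index, 'value' takes priority over any
    other name; indexes with no mapped name yield None.
    """
    rval = []
    for i in range(largest_index + 1):
        found_column = False
        for name, index in columns.items():
            if index == i:
                if not found_column:
                    rval.append(name)
                elif name == 'value':
                    # the column named 'value' always wins over other names
                    rval[-1] = name
                found_column = True
        if not found_column:
            rval.append(None)
    return rval
```

Because the `'value'` branch overwrites whatever name was appended first, the result is the same regardless of iteration order, which is what the pre-patch `break` got wrong.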
Repository URL: https://bitbucket.org/galaxy/galaxy-central/