galaxy-commits
October 2013
- 1 participant
- 226 discussions
25 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/654dfe820960/
Changeset: 654dfe820960
Branch: search
User: dannon
Date: 2013-10-25 21:32:18
Summary: Close search branch.
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/776ca4a63a0b/
Changeset: 776ca4a63a0b
Branch: search
User: Kyle Ellrott
Date: 2013-10-25 00:34:17
Summary: Adding better path parsing so tool API can take tool id with slashes (for example tools that are named after tool shed repos)
Affected #: 1 file
diff -r 35a847f0f33d2944b8033aefeb8d5ef0c734c8c6 -r 776ca4a63a0b2ec568582bea3c2bb43cc24675ee lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -155,6 +155,7 @@
webapp.mapper.resource( 'role', 'roles', path_prefix='/api' )
webapp.mapper.resource( 'group', 'groups', path_prefix='/api' )
webapp.mapper.resource_with_deleted( 'quota', 'quotas', path_prefix='/api' )
+ webapp.mapper.connect( '/api/tools/{id:.*?}', action='show', controller="tools" )
webapp.mapper.resource( 'tool', 'tools', path_prefix='/api' )
webapp.mapper.resource_with_deleted( 'user', 'users', path_prefix='/api' )
webapp.mapper.resource( 'genome', 'genomes', path_prefix='/api' )
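The route added above relies on Routes' inline-requirement syntax (`{id:.*?}`), which lets the `id` segment span slashes instead of stopping at the first one. A minimal stdlib sketch of the effect, using hypothetical regexes standing in for the patterns the mapper generates:

```python
import re

# A default Routes-style segment matches up to the next slash, so a tool id
# like "toolshed.../repos/owner/repo/tool/version" would never match
# "/api/tools/{id}".  The "{id:.*?}" requirement lets the id group itself
# contain slashes; the two regexes below sketch that difference.
default_segment = re.compile(r'^/api/tools/(?P<id>[^/]+)$')
greedy_segment = re.compile(r'^/api/tools/(?P<id>.+)$')

url = '/api/tools/toolshed.g2.bx.psu.edu/repos/devteam/bwa/bwa/0.5.9'
assert default_segment.match(url) is None  # plain segment rejects slashes
match = greedy_segment.match(url)
assert match.group('id') == 'toolshed.g2.bx.psu.edu/repos/devteam/bwa/bwa/0.5.9'
```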
https://bitbucket.org/galaxy/galaxy-central/commits/9de9f6e5ded2/
Changeset: 9de9f6e5ded2
User: dannon
Date: 2013-10-25 21:31:34
Summary: Merged in kellrott/galaxy-central/search (pull request #242)
Fix routing to /api/tool/<id> when id has slashes in name
Affected #: 1 file
diff -r a27073465c2f4925012e1df12e19321aed2b9f4a -r 9de9f6e5ded2a9ac5ace423564e76a6b20f694d4 lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -155,6 +155,7 @@
webapp.mapper.resource( 'role', 'roles', path_prefix='/api' )
webapp.mapper.resource( 'group', 'groups', path_prefix='/api' )
webapp.mapper.resource_with_deleted( 'quota', 'quotas', path_prefix='/api' )
+ webapp.mapper.connect( '/api/tools/{id:.*?}', action='show', controller="tools" )
webapp.mapper.resource( 'tool', 'tools', path_prefix='/api' )
webapp.mapper.resource_with_deleted( 'user', 'users', path_prefix='/api' )
webapp.mapper.resource( 'genome', 'genomes', path_prefix='/api' )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
4 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/0255f2f29a2b/
Changeset: 0255f2f29a2b
Branch: next-stable
User: Dave Bouvier
Date: 2013-10-25 20:39:44
Summary: Check for matching in-memory sniffer before appending. Remove sniffer when deactivating or uninstalling a repository. Additional checks for keys existing before deleting them.
Affected #: 1 file
diff -r 8109823f43c30a7efa744471b657efbc9624b218 -r 0255f2f29a2ba9fd047c89be919453ea90ab731f lib/galaxy/datatypes/registry.py
--- a/lib/galaxy/datatypes/registry.py
+++ b/lib/galaxy/datatypes/registry.py
@@ -217,7 +217,8 @@
if sniffers:
for elem in sniffers.findall( 'sniffer' ):
# Keep an in-memory list of sniffer elems to enable persistence.
- self.sniffer_elems.append( elem )
+ if elem not in self.sniffer_elems:
+ self.sniffer_elems.append( elem )
dtype = elem.get( 'type', None )
if dtype:
try:
@@ -239,6 +240,8 @@
module = getattr( module, comp )
aclass = getattr( module, datatype_class_name )()
if deactivate:
+ if elem in self.sniffer_elems:
+ self.sniffer_elems.remove( elem )
for sniffer_class in self.sniff_order:
if sniffer_class.__class__ == aclass.__class__:
self.sniff_order.remove( sniffer_class )
@@ -463,10 +466,10 @@
target_datatype = elem[2]
if installed_repository_dict:
converter_path = installed_repository_dict[ 'converter_path' ]
+ else:
+ converter_path = self.converters_path
+ try:
config_path = os.path.join( converter_path, tool_config )
- else:
- config_path = os.path.join( self.converters_path, tool_config )
- try:
converter = toolbox.load_tool( config_path )
if installed_repository_dict:
# If the converter is included in an installed tool shed repository, set the tool
@@ -484,7 +487,8 @@
converter.id = tool_dict[ 'guid' ]
break
if deactivate:
- del toolbox.tools_by_id[ converter.id ]
+ if converter.id in toolbox.tools_by_id:
+ del toolbox.tools_by_id[ converter.id ]
if source_datatype in self.datatype_converters:
del self.datatype_converters[ source_datatype ][ target_datatype ]
self.log.debug( "Deactivated converter: %s", converter.id )
@@ -496,9 +500,9 @@
self.log.debug( "Loaded converter: %s", converter.id )
except Exception, e:
if deactivate:
- self.log.exception( "Error deactivating converter (%s): %s" % ( config_path, str( e ) ) )
+ self.log.exception( "Error deactivating converter from (%s): %s" % ( converter_path, str( e ) ) )
else:
- self.log.exception( "Error loading converter (%s): %s" % ( config_path, str( e ) ) )
+ self.log.exception( "Error loading converter (%s): %s" % ( converter_path, str( e ) ) )
def load_display_applications( self, installed_repository_dict=None, deactivate=False ):
"""
If deactivate is False, add display applications from self.display_app_containers or
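The registry changes above make sniffer registration idempotent: loading never appends a duplicate elem, and deactivating never removes an entry that is not present. A minimal sketch of that guard pattern, using a hypothetical `SnifferRegistry` class rather than the real Galaxy API:

```python
# Hypothetical sketch of the commit's guard pattern: repeated loads of the
# same sniffer config must not duplicate entries, and deactivation must
# tolerate entries that were never registered.
class SnifferRegistry:
    def __init__(self):
        self.sniffer_elems = []

    def load(self, elem):
        if elem not in self.sniffer_elems:   # append only once
            self.sniffer_elems.append(elem)

    def deactivate(self, elem):
        if elem in self.sniffer_elems:       # remove only if present
            self.sniffer_elems.remove(elem)

reg = SnifferRegistry()
reg.load('fastq')
reg.load('fastq')                            # second load is a no-op
assert reg.sniffer_elems == ['fastq']
reg.deactivate('bam')                        # unknown elem: no ValueError
reg.deactivate('fastq')
assert reg.sniffer_elems == []
```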
https://bitbucket.org/galaxy/galaxy-central/commits/0101573c8701/
Changeset: 0101573c8701
User: Dave Bouvier
Date: 2013-10-25 20:42:50
Summary: Merge fix from next-stable.
Affected #: 1 file
diff -r c432e846d53d1506dff86d914bb4c513ffaf19d5 -r 0101573c8701fc94bc848c2a785c48d4f3e484c3 lib/galaxy/datatypes/registry.py
--- a/lib/galaxy/datatypes/registry.py
+++ b/lib/galaxy/datatypes/registry.py
@@ -217,7 +217,8 @@
if sniffers:
for elem in sniffers.findall( 'sniffer' ):
# Keep an in-memory list of sniffer elems to enable persistence.
- self.sniffer_elems.append( elem )
+ if elem not in self.sniffer_elems:
+ self.sniffer_elems.append( elem )
dtype = elem.get( 'type', None )
if dtype:
try:
@@ -239,6 +240,8 @@
module = getattr( module, comp )
aclass = getattr( module, datatype_class_name )()
if deactivate:
+ if elem in self.sniffer_elems:
+ self.sniffer_elems.remove( elem )
for sniffer_class in self.sniff_order:
if sniffer_class.__class__ == aclass.__class__:
self.sniff_order.remove( sniffer_class )
@@ -463,10 +466,10 @@
target_datatype = elem[2]
if installed_repository_dict:
converter_path = installed_repository_dict[ 'converter_path' ]
+ else:
+ converter_path = self.converters_path
+ try:
config_path = os.path.join( converter_path, tool_config )
- else:
- config_path = os.path.join( self.converters_path, tool_config )
- try:
converter = toolbox.load_tool( config_path )
if installed_repository_dict:
# If the converter is included in an installed tool shed repository, set the tool
@@ -484,7 +487,8 @@
converter.id = tool_dict[ 'guid' ]
break
if deactivate:
- del toolbox.tools_by_id[ converter.id ]
+ if converter.id in toolbox.tools_by_id:
+ del toolbox.tools_by_id[ converter.id ]
if source_datatype in self.datatype_converters:
del self.datatype_converters[ source_datatype ][ target_datatype ]
self.log.debug( "Deactivated converter: %s", converter.id )
@@ -496,9 +500,9 @@
self.log.debug( "Loaded converter: %s", converter.id )
except Exception, e:
if deactivate:
- self.log.exception( "Error deactivating converter (%s): %s" % ( config_path, str( e ) ) )
+ self.log.exception( "Error deactivating converter from (%s): %s" % ( converter_path, str( e ) ) )
else:
- self.log.exception( "Error loading converter (%s): %s" % ( config_path, str( e ) ) )
+ self.log.exception( "Error loading converter (%s): %s" % ( converter_path, str( e ) ) )
def load_display_applications( self, installed_repository_dict=None, deactivate=False ):
"""
If deactivate is False, add display applications from self.display_app_containers or
https://bitbucket.org/galaxy/galaxy-central/commits/870a79ce1a10/
Changeset: 870a79ce1a10
Branch: next-stable
User: Dave Bouvier
Date: 2013-10-25 20:45:29
Summary: Correctly handle extracting a tarball that does contain all files and directories within a single directory at the root of the archive, but does not contain an entry for the directory itself.
Affected #: 1 file
diff -r 0255f2f29a2ba9fd047c89be919453ea90ab731f -r 870a79ce1a1048ec941b4ea298b3422ade4fc9e6 lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
@@ -43,31 +43,15 @@
os.makedirs( extraction_path )
self.archive.extractall( extraction_path )
else:
- # Sort filenames within the archive by length, because if the shortest path entry in the archive is a directory,
- # and the next entry has the directory prepended, then everything following that entry must be within that directory.
- # For example, consider tarball for BWA 0.5.9:
- # bwa-0.5.9.tar.bz2:
- # bwa-0.5.9/
- # bwa-0.5.9/bwt.c
- # bwa-0.5.9/bwt.h
- # bwa-0.5.9/Makefile
- # bwa-0.5.9/bwt_gen/
- # bwa-0.5.9/bwt_gen/Makefile
- # bwa-0.5.9/bwt_gen/bwt_gen.c
- # bwa-0.5.9/bwt_gen/bwt_gen.h
- # When sorted by length, one sees a directory at the root of the tarball, and all other tarball contents as
- # children of that directory.
- filenames = sorted( [ self.getname( item ) for item in contents ], cmp=lambda a,b: cmp( len( a ), len( b ) ) )
- parent_name = filenames[ 0 ]
- parent = self.getmember( parent_name )
- first_child = filenames[ 1 ]
- if first_child.startswith( parent_name ) and parent is not None and self.isdir( parent ):
- if self.getname( parent ) == self.file_name:
- self.archive.extractall( os.path.join( path ) )
- extraction_path = os.path.join( path, self.file_name )
- else:
- self.archive.extractall( os.path.join( path ) )
- extraction_path = os.path.join( path, self.getname( parent ) )
+ # Get the common prefix for all the files in the archive. If the common prefix ends with a slash,
+ # or self.isdir() returns True, the archive contains a single directory with the desired contents.
+ # Otherwise, it contains multiple files and/or directories at the root of the archive.
+ common_prefix = os.path.commonprefix( [ self.getname( item ) for item in contents ] )
+ if len( common_prefix ) >= 1 and not common_prefix.endswith( os.sep ) and self.isdir( self.getmember( common_prefix ) ):
+ common_prefix += os.sep
+ if common_prefix.endswith( os.sep ):
+ self.archive.extractall( os.path.join( path ) )
+ extraction_path = os.path.join( path, common_prefix )
else:
extraction_path = os.path.join( path, self.file_name )
if not os.path.exists( extraction_path ):
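The rewritten extraction logic above replaces the length-sorted scan with `os.path.commonprefix`. A minimal sketch of the detection step, under the assumption of a POSIX separator (`/`); note that `commonprefix` is character-wise rather than path-wise, which is why the real code also checks `isdir()` on the prefix member:

```python
import os.path

# Hypothetical sketch of the new detection logic: if every member name in an
# archive shares a common directory prefix, the archive unpacks into a single
# root directory even when the archive lacks an explicit entry for that
# directory itself.
def single_root(names):
    prefix = os.path.commonprefix(names)
    # A prefix with a trailing separator (e.g. "bwa-0.5.9/") marks one root.
    return prefix if prefix.endswith('/') else None

# BWA-style tarball with no entry for the top directory itself:
names = ['bwa-0.5.9/bwt.c', 'bwa-0.5.9/bwt.h', 'bwa-0.5.9/Makefile']
assert single_root(names) == 'bwa-0.5.9/'
# Multiple items at the archive root: no common directory.
assert single_root(['README', 'src/main.c']) is None
```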
https://bitbucket.org/galaxy/galaxy-central/commits/a27073465c2f/
Changeset: a27073465c2f
User: Dave Bouvier
Date: 2013-10-25 20:46:48
Summary: Merge fix from next-stable.
Affected #: 1 file
diff -r 0101573c8701fc94bc848c2a785c48d4f3e484c3 -r a27073465c2f4925012e1df12e19321aed2b9f4a lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/td_common_util.py
@@ -43,31 +43,15 @@
os.makedirs( extraction_path )
self.archive.extractall( extraction_path )
else:
- # Sort filenames within the archive by length, because if the shortest path entry in the archive is a directory,
- # and the next entry has the directory prepended, then everything following that entry must be within that directory.
- # For example, consider tarball for BWA 0.5.9:
- # bwa-0.5.9.tar.bz2:
- # bwa-0.5.9/
- # bwa-0.5.9/bwt.c
- # bwa-0.5.9/bwt.h
- # bwa-0.5.9/Makefile
- # bwa-0.5.9/bwt_gen/
- # bwa-0.5.9/bwt_gen/Makefile
- # bwa-0.5.9/bwt_gen/bwt_gen.c
- # bwa-0.5.9/bwt_gen/bwt_gen.h
- # When sorted by length, one sees a directory at the root of the tarball, and all other tarball contents as
- # children of that directory.
- filenames = sorted( [ self.getname( item ) for item in contents ], cmp=lambda a,b: cmp( len( a ), len( b ) ) )
- parent_name = filenames[ 0 ]
- parent = self.getmember( parent_name )
- first_child = filenames[ 1 ]
- if first_child.startswith( parent_name ) and parent is not None and self.isdir( parent ):
- if self.getname( parent ) == self.file_name:
- self.archive.extractall( os.path.join( path ) )
- extraction_path = os.path.join( path, self.file_name )
- else:
- self.archive.extractall( os.path.join( path ) )
- extraction_path = os.path.join( path, self.getname( parent ) )
+ # Get the common prefix for all the files in the archive. If the common prefix ends with a slash,
+ # or self.isdir() returns True, the archive contains a single directory with the desired contents.
+ # Otherwise, it contains multiple files and/or directories at the root of the archive.
+ common_prefix = os.path.commonprefix( [ self.getname( item ) for item in contents ] )
+ if len( common_prefix ) >= 1 and not common_prefix.endswith( os.sep ) and self.isdir( self.getmember( common_prefix ) ):
+ common_prefix += os.sep
+ if common_prefix.endswith( os.sep ):
+ self.archive.extractall( os.path.join( path ) )
+ extraction_path = os.path.join( path, common_prefix )
else:
extraction_path = os.path.join( path, self.file_name )
if not os.path.exists( extraction_path ):
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/8109823f43c3/
Changeset: 8109823f43c3
Branch: next-stable
User: dan
Date: 2013-10-25 17:51:37
Summary: Fix for Javascript requiring setting a user/public name during registration
Affected #: 1 file
diff -r 94aea2327373d4cb651e3db3fe5113a5b0c669f5 -r 8109823f43c30a7efa744471b657efbc9624b218 templates/user/register.mako
--- a/templates/user/register.mako
+++ b/templates/user/register.mako
@@ -75,7 +75,7 @@
var error_text_password_short = 'Please use a password of at least 6 characters';
var error_text_password_match = "Passwords don't match";
- var validForm = true;
+ var validForm = true;
var email = $('#email_input').val();
var name = $('#name_input').val()
@@ -83,7 +83,7 @@
else if (!validateString(email,"email")){ renderError(error_text_email); validForm = false;}
else if (!($('#password_input').val() === $('#password_check_input').val())){ renderError(error_text_password_match); validForm = false;}
else if ($('#password_input').val().length < 6 ){ renderError(error_text_password_short); validForm = false;}
- else if (!(validateString(name,"username"))){ renderError(error_text_username_characters); validForm = false;}
+ else if (name && !(validateString(name,"username"))){ renderError(error_text_username_characters); validForm = false;}
if (!validForm) {
e.preventDefault();
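The one-character change above (`name && ...`) makes the username optional: it is validated only when the field is non-empty. A Python sketch of the same validation flow, with hypothetical helper names and an assumed username character set:

```python
import re

# Hypothetical sketch of the registration-form check: the username field is
# optional, so it is validated only when the user actually typed one.
def valid_username(name):
    return bool(re.match(r'^[a-z0-9\-]+$', name))

def form_error(email, password, confirm, name):
    if not email:
        return 'missing email'
    if password != confirm:
        return "Passwords don't match"
    if len(password) < 6:
        return 'Please use a password of at least 6 characters'
    # Before the fix an empty name failed validation; now it is skipped.
    if name and not valid_username(name):
        return 'invalid username characters'
    return None

assert form_error('a@b.org', 'secret1', 'secret1', '') is None   # blank name OK
assert form_error('a@b.org', 'secret1', 'secret1', 'Bad Name!') is not None
```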
https://bitbucket.org/galaxy/galaxy-central/commits/c432e846d53d/
Changeset: c432e846d53d
User: dan
Date: 2013-10-25 17:52:03
Summary: merge next-stable
Affected #: 3 files
diff -r 6865031ea0c473984473e57100f60a3739065ca1 -r c432e846d53d1506dff86d914bb4c513ffaf19d5 templates/user/register.mako
--- a/templates/user/register.mako
+++ b/templates/user/register.mako
@@ -75,7 +75,7 @@
var error_text_password_short = 'Please use a password of at least 6 characters';
var error_text_password_match = "Passwords don't match";
- var validForm = true;
+ var validForm = true;
var email = $('#email_input').val();
var name = $('#name_input').val()
@@ -83,7 +83,7 @@
else if (!validateString(email,"email")){ renderError(error_text_email); validForm = false;}
else if (!($('#password_input').val() === $('#password_check_input').val())){ renderError(error_text_password_match); validForm = false;}
else if ($('#password_input').val().length < 6 ){ renderError(error_text_password_short); validForm = false;}
- else if (!(validateString(name,"username"))){ renderError(error_text_username_characters); validForm = false;}
+ else if (name && !(validateString(name,"username"))){ renderError(error_text_username_characters); validForm = false;}
if (!validForm) {
e.preventDefault();
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Display an error message rather than raising an exception when a changeset being committed to a tool shed repository includes a dependency definition that has an invalid <repository> tag set.
by commits-noreply@bitbucket.org 25 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/6865031ea0c4/
Changeset: 6865031ea0c4
User: greg
Date: 2013-10-25 17:06:28
Summary: Display an error message rather than raising an exception when a changeset being committed to a tool shed repository includes a dependency definition that has an invalid <repository> tag set.
Affected #: 3 files
diff -r d57971b60b8d907c33bf4dfc1bba2432baaf96e9 -r 6865031ea0c473984473e57100f60a3739065ca1 lib/galaxy/webapps/tool_shed/controllers/upload.py
--- a/lib/galaxy/webapps/tool_shed/controllers/upload.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/upload.py
@@ -135,8 +135,14 @@
# Move some version of the uploaded file to the load_point within the repository hierarchy.
if uploaded_file_filename in [ suc.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME ]:
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name, unpopulate=False )
- if altered:
+ altered, root_elem, error_message = commit_util.handle_repository_dependencies_definition( trans,
+ uploaded_file_name,
+ unpopulate=False )
+ if error_message:
+ ok = False
+ message = error_message
+ status = 'error'
+ elif altered:
tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, full_path )
else:
@@ -152,26 +158,31 @@
shutil.move( uploaded_file_name, full_path )
else:
shutil.move( uploaded_file_name, full_path )
- # See if any admin users have chosen to receive email alerts when a repository is updated. If so, check every uploaded file to ensure
- # content is appropriate.
- check_contents = commit_util.check_file_contents_for_email_alerts( trans )
- if check_contents and os.path.isfile( full_path ):
- content_alert_str = commit_util.check_file_content_for_html_and_images( full_path )
- else:
- content_alert_str = ''
- commands.add( repo.ui, repo, full_path )
- # Convert from unicode to prevent "TypeError: array item must be char"
- full_path = full_path.encode( 'ascii', 'replace' )
- commands.commit( repo.ui, repo, full_path, user=trans.user.username, message=commit_message )
- if full_path.endswith( 'tool_data_table_conf.xml.sample' ):
- # Handle the special case where a tool_data_table_conf.xml.sample file is being uploaded by parsing the file and adding new entries
- # to the in-memory trans.app.tool_data_tables dictionary.
- error, error_message = tool_util.handle_sample_tool_data_table_conf_file( trans.app, full_path )
- if error:
- message = '%s<br/>%s' % ( message, error_message )
- # See if the content of the change set was valid.
- admin_only = len( repository.downloadable_revisions ) != 1
- suc.handle_email_alerts( trans, repository, content_alert_str=content_alert_str, new_repo_alert=new_repo_alert, admin_only=admin_only )
+ if ok:
+ # See if any admin users have chosen to receive email alerts when a repository is updated. If so, check every uploaded file to ensure
+ # content is appropriate.
+ check_contents = commit_util.check_file_contents_for_email_alerts( trans )
+ if check_contents and os.path.isfile( full_path ):
+ content_alert_str = commit_util.check_file_content_for_html_and_images( full_path )
+ else:
+ content_alert_str = ''
+ commands.add( repo.ui, repo, full_path )
+ # Convert from unicode to prevent "TypeError: array item must be char"
+ full_path = full_path.encode( 'ascii', 'replace' )
+ commands.commit( repo.ui, repo, full_path, user=trans.user.username, message=commit_message )
+ if full_path.endswith( 'tool_data_table_conf.xml.sample' ):
+ # Handle the special case where a tool_data_table_conf.xml.sample file is being uploaded by parsing the file and adding new entries
+ # to the in-memory trans.app.tool_data_tables dictionary.
+ error, error_message = tool_util.handle_sample_tool_data_table_conf_file( trans.app, full_path )
+ if error:
+ message = '%s<br/>%s' % ( message, error_message )
+ # See if the content of the change set was valid.
+ admin_only = len( repository.downloadable_revisions ) != 1
+ suc.handle_email_alerts( trans,
+ repository,
+ content_alert_str=content_alert_str,
+ new_repo_alert=new_repo_alert,
+ admin_only=admin_only )
if ok:
# Update the repository files for browsing.
suc.update_repository( repo )
@@ -283,8 +294,12 @@
uploaded_file_name = os.path.abspath( os.path.join( root, uploaded_file ) )
if os.path.split( uploaded_file_name )[ -1 ] == suc.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME:
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name, unpopulate=False )
- if altered:
+ altered, root_elem, error_message = commit_util.handle_repository_dependencies_definition( trans,
+ uploaded_file_name,
+ unpopulate=False )
+ if error_message:
+ return False, error_message, [], '', [], []
+ elif altered:
tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == suc.TOOL_DEPENDENCY_DEFINITION_FILENAME:
@@ -344,8 +359,12 @@
uploaded_file_name = os.path.join( full_path, filename )
if os.path.split( uploaded_file_name )[ -1 ] == suc.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME:
# Inspect the contents of the file to see if changeset_revision values are missing and if so, set them appropriately.
- altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, uploaded_file_name, unpopulate=False )
- if altered:
+ altered, root_elem, error_message = commit_util.handle_repository_dependencies_definition( trans,
+ uploaded_file_name,
+ unpopulate=False )
+ if error_message:
+ return False, error_message, [], '', [], []
+ elif altered:
tmp_filename = xml_util.create_and_write_tmp_file( root_elem )
shutil.move( tmp_filename, uploaded_file_name )
elif os.path.split( uploaded_file_name )[ -1 ] == suc.TOOL_DEPENDENCY_DEFINITION_FILENAME:
diff -r d57971b60b8d907c33bf4dfc1bba2432baaf96e9 -r 6865031ea0c473984473e57100f60a3739065ca1 lib/tool_shed/util/commit_util.py
--- a/lib/tool_shed/util/commit_util.py
+++ b/lib/tool_shed/util/commit_util.py
@@ -257,7 +257,7 @@
# Make sure we're looking at a valid repository_dependencies.xml file.
tree, error_message = xml_util.parse_xml( repository_dependencies_config )
if tree is None:
- return False, None
+ return False, None, error_message
root = tree.getroot()
if root.tag == 'repositories':
for index, elem in enumerate( root ):
@@ -265,14 +265,14 @@
# <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
revised, elem, error_message = handle_repository_dependency_elem( trans, elem, unpopulate=unpopulate )
if error_message:
- exception_message = 'The repository_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
- raise Exception( exception_message )
+ error_message = 'The repository_dependencies.xml file contains an invalid <repository> tag. %s' % error_message
+ return False, None, error_message
if revised:
root[ index ] = elem
if not altered:
altered = True
- return altered, root
- return False, None
+ return altered, root, error_message
+ return False, None, error_message
def handle_repository_dependency_elem( trans, elem, unpopulate=False ):
# <repository name="molecule_datatypes" owner="test" changeset_revision="1a070566e9c6" />
diff -r d57971b60b8d907c33bf4dfc1bba2432baaf96e9 -r 6865031ea0c473984473e57100f60a3739065ca1 lib/tool_shed/util/export_util.py
--- a/lib/tool_shed/util/export_util.py
+++ b/lib/tool_shed/util/export_util.py
@@ -142,7 +142,9 @@
# See if we have a repository dependencies defined.
if name == suc.REPOSITORY_DEPENDENCY_DEFINITION_FILENAME:
# Eliminate the toolshed, and changeset_revision attributes from all <repository> tags.
- altered, root_elem = commit_util.handle_repository_dependencies_definition( trans, full_path, unpopulate=True )
+ altered, root_elem, error_message = commit_util.handle_repository_dependencies_definition( trans, full_path, unpopulate=True )
+ if error_message:
+ return None, error_message
if altered:
tmp_filename = xml_util.create_and_write_tmp_file( root_elem, use_indent=True )
shutil.move( tmp_filename, full_path )
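The refactor above changes `handle_repository_dependencies_definition` from raising an exception to returning the error in a three-element `(altered, root_elem, error_message)` tuple, so the upload controller can render the message to the user. A minimal sketch of that pattern, with a hypothetical parser standing in for the real commit_util code:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of the commit's error-tuple pattern: instead of raising
# when a <repository> tag is invalid, return the error message so the caller
# can display it.  The real code also tracks whether the tree was altered;
# this sketch never modifies it, so altered is always False.
def handle_repository_dependencies(xml_text):
    """Return (altered, root_elem, error_message)."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return False, None, str(e)
    if root.tag != 'repositories':
        return False, None, 'unexpected root tag: %s' % root.tag
    for elem in root:
        if elem.tag == 'repository' and not elem.get('name'):
            # Surface the problem instead of raising, as the commit does.
            return False, None, 'invalid <repository> tag: missing name'
    return False, root, ''

altered, root, err = handle_repository_dependencies(
    '<repositories><repository owner="test"/></repositories>')
assert err == 'invalid <repository> tag: missing name'
altered, root, err = handle_repository_dependencies(
    '<repositories><repository name="molecule_datatypes"/></repositories>')
assert err == '' and root is not None
```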
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ed0e437ce0d6/
Changeset: ed0e437ce0d6
User: dannon
Date: 2013-10-25 06:43:55
Summary: Missing tempfile import in dataproviders/external.
Affected #: 1 file
diff -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c -r ed0e437ce0d6663ce247161a32287e2de977d226 lib/galaxy/datatypes/dataproviders/external.py
--- a/lib/galaxy/datatypes/dataproviders/external.py
+++ b/lib/galaxy/datatypes/dataproviders/external.py
@@ -3,12 +3,12 @@
or not in a file.
"""
+import base
+import gzip
+import line
import subprocess
+import tempfile
import urllib, urllib2
-import gzip
-
-import base
-import line
_TODO = """
YAGNI: ftp, image, cryptos, sockets
https://bitbucket.org/galaxy/galaxy-central/commits/d57971b60b8d/
Changeset: d57971b60b8d
User: dannon
Date: 2013-10-25 06:57:53
Summary: Bugfix/cleanup in reports/workflows module -- was using workflow_id instead of stored_workflow_id
Affected #: 1 file
diff -r ed0e437ce0d6663ce247161a32287e2de977d226 -r d57971b60b8d907c33bf4dfc1bba2432baaf96e9 lib/galaxy/webapps/reports/controllers/workflows.py
--- a/lib/galaxy/webapps/reports/controllers/workflows.py
+++ b/lib/galaxy/webapps/reports/controllers/workflows.py
@@ -1,34 +1,40 @@
-import calendar, operator, os, socket
-from datetime import datetime, date, timedelta
-from time import mktime, strftime, localtime
+import calendar
+from datetime import date, timedelta
+from galaxy import eggs
+from galaxy import model, util
+from galaxy.model.orm import and_
from galaxy.web.base.controller import BaseUIController, web
-from galaxy import model, util
-from galaxy.web.framework.helpers import time_ago, iff, grids
-from galaxy.model.orm import and_, not_, or_
-import pkg_resources
-pkg_resources.require( "SQLAlchemy >= 0.4" )
+from galaxy.web.framework.helpers import grids
+eggs.require( "SQLAlchemy >= 0.4" )
import sqlalchemy as sa
+
import logging
log = logging.getLogger( __name__ )
+
class SpecifiedDateListGrid( grids.Grid ):
+
class WorkflowNameColumn( grids.TextColumn ):
def get_value( self, trans, grid, stored_workflow ):
return stored_workflow.name
+
class CreateTimeColumn( grids.DateTimeColumn ):
def get_value( self, trans, grid, stored_workflow ):
return stored_workflow.create_time
+
class UserColumn( grids.TextColumn ):
def get_value( self, trans, grid, stored_workflow ):
if stored_workflow.user:
return stored_workflow.user.email
return 'unknown'
+
class EmailColumn( grids.GridColumn ):
def filter( self, trans, user, query, column_filter ):
if column_filter == 'All':
return query
return query.filter( and_( model.StoredWorkflow.table.c.user_id == model.User.table.c.id,
model.User.table.c.email == column_filter ) )
+
class SpecifiedDateColumn( grids.GridColumn ):
def filter( self, trans, user, query, column_filter ):
if column_filter == 'All':
@@ -94,6 +100,7 @@
.join( model.User ) \
.enable_eagerloads( False )
+
class Workflows( BaseUIController ):
specified_date_list_grid = SpecifiedDateListGrid()
@@ -120,7 +127,7 @@
**kwd ) )
elif operation == "user_per_month":
stored_workflow_id = kwd.get( 'id', None )
- workflow = get_workflow( trans, workflow_id )
+ workflow = get_workflow( trans, stored_workflow_id )
if workflow.user:
kwd[ 'email' ] = workflow.user.email
else:
@@ -129,9 +136,9 @@
action='user_per_month',
**kwd ) )
return self.specified_date_list_grid( trans, **kwd )
+
@web.expose
def per_month_all( self, trans, **kwd ):
- params = util.Params( kwd )
message = ''
q = sa.select( ( sa.func.date_trunc( 'month', sa.func.date( model.StoredWorkflow.table.c.create_time ) ).label( 'date' ),sa.func.count( model.StoredWorkflow.table.c.id ).label( 'total_workflows' ) ),
from_obj = [ sa.outerjoin( model.StoredWorkflow.table, model.User.table ) ],
@@ -146,9 +153,9 @@
return trans.fill_template( '/webapps/reports/workflows_per_month_all.mako',
workflows=workflows,
message=message )
+
@web.expose
def per_user( self, trans, **kwd ):
- params = util.Params( kwd )
message = ''
workflows = []
q = sa.select( ( model.User.table.c.email.label( 'user_email' ),
@@ -160,6 +167,7 @@
workflows.append( ( row.user_email,
row.total_workflows ) )
return trans.fill_template( '/webapps/reports/workflows_per_user.mako', workflows=workflows, message=message )
+
@web.expose
def user_per_month( self, trans, **kwd ):
params = util.Params( kwd )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
17 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/79fd1058b25d/
Changeset: 79fd1058b25d
User: dannon
Date: 2013-10-25 00:40:09
Summary: Clean up imports in util package base
Affected #: 1 file
diff -r 97cb7306dc345ce5e81861e421c5277701161802 -r 79fd1058b25dc43a5292843a01106e38dfd83994 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -2,25 +2,30 @@
Utility functions used systemwide.
"""
-import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib, errno, shutil
+import binascii, errno, grp, logging, os, pickle, random, re, shutil, smtplib, stat, string, sys, tempfile, threading
from email.MIMEText import MIMEText
from os.path import relpath
from hashlib import md5
from galaxy import eggs
-import pkg_resources
-pkg_resources.require( 'docutils' )
+eggs.require( 'docutils' )
import docutils.core
import docutils.writers.html4css1
-pkg_resources.require( 'elementtree' )
+eggs.require( 'elementtree' )
from elementtree import ElementTree, ElementInclude
-pkg_resources.require( "wchartype" )
+eggs.require( "wchartype" )
import wchartype
+from inflection import Inflector, English
+inflector = Inflector(English)
+
+eggs.require( "simplejson" )
+import simplejson
+
log = logging.getLogger(__name__)
_lock = threading.RLock()
@@ -35,12 +40,6 @@
NULL_CHAR = '\000'
BINARY_CHARS = [ NULL_CHAR ]
-from inflection import Inflector, English
-inflector = Inflector(English)
-
-pkg_resources.require( "simplejson" )
-import simplejson
-
def is_multi_byte( chars ):
for char in chars:
try:
@@ -946,5 +945,5 @@
return os.path.abspath(galaxy_root_path)
if __name__ == '__main__':
- import doctest, sys
+ import doctest
doctest.testmod(sys.modules[__name__], verbose=False)
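The last hunk above trims the redundant `sys` import from the module's `__main__` doctest hook. The self-test pattern it keeps looks like this (the `commify` helper here is illustrative, not part of `galaxy.util`):

```python
# Minimal sketch of the doctest self-test hook retained above.
import doctest
import sys

def commify(n):
    """Format an integer with thousands separators.

    >>> commify(1234567)
    '1,234,567'
    """
    return "{:,}".format(n)

if __name__ == '__main__':
    # testmod() scans this module's docstrings and runs the >>> examples.
    doctest.testmod(sys.modules[__name__], verbose=False)
```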
https://bitbucket.org/galaxy/galaxy-central/commits/a81f9cdc2035/
Changeset: a81f9cdc2035
User: dannon
Date: 2013-10-25 00:42:18
Summary: Correct errors in util's xml_to_dict
Affected #: 1 file
diff -r 79fd1058b25dc43a5292843a01106e38dfd83994 -r a81f9cdc2035e6c7ea2febeb0a1af7e10d2ef2a3 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -176,9 +176,9 @@
sub_elem_dict[ key ].append( value )
for key, value in sub_elem_dict.iteritems():
if len( value ) == 1:
- rval[ elem.tag ][ k ] = value[0]
+ rval[ elem.tag ][ key ] = value[0]
else:
- rval[ elem.tag ][ k ] = value
+ rval[ elem.tag ][ key ] = value
if elem.attrib:
for key, value in elem.attrib.iteritems():
rval[ elem.tag ][ "@%s" % key ] = value
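The `xml_to_dict` bug fixed above was a stale loop variable: the second loop indexed with `k` while iterating over `key`. A hypothetical reduction of that aggregation step, with the corrected variable:

```python
# Hypothetical reduction of the xml_to_dict aggregation fixed above:
# repeated child tags collect into lists, single ones stay scalar.
from collections import defaultdict

def collapse_children(pairs):
    """Group (tag, value) pairs; singletons stay scalar, repeats become lists."""
    sub_elem_dict = defaultdict(list)
    for key, value in pairs:
        sub_elem_dict[key].append(value)
    rval = {}
    for key, value in sub_elem_dict.items():
        if len(value) == 1:
            rval[key] = value[0]   # `key`, not a stale `k` from another loop
        else:
            rval[key] = value
    return rval
```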
https://bitbucket.org/galaxy/galaxy-central/commits/8305d16cf154/
Changeset: 8305d16cf154
User: dannon
Date: 2013-10-25 00:43:05
Summary: Strip unused variables from util package
Affected #: 1 file
diff -r a81f9cdc2035e6c7ea2febeb0a1af7e10d2ef2a3 -r 8305d16cf1542166a8d1c2bd04abd8a0562ca438 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -44,7 +44,7 @@
for char in chars:
try:
char = unicode( char )
- except UnicodeDecodeError, e:
+ except UnicodeDecodeError:
# Probably binary
return False
if wchartype.is_asian( char ) or \
@@ -736,7 +736,7 @@
name = names.next()
file = os.path.join(dir, prefix + name)
try:
- linked_path = os.link( src, file )
+ os.link( src, file )
return (os.path.abspath(file))
except OSError, e:
if e.errno == errno.EEXIST:
https://bitbucket.org/galaxy/galaxy-central/commits/02bbb394aa98/
Changeset: 02bbb394aa98
User: dannon
Date: 2013-10-25 00:59:49
Summary: Fix missing HTTPNotImplemented import in api/annotations; cleanup existing.
Affected #: 1 file
diff -r 8305d16cf1542166a8d1c2bd04abd8a0562ca438 -r 02bbb394aa9856c5a607a8e61e7dfea9a7241dfd lib/galaxy/webapps/galaxy/api/annotations.py
--- a/lib/galaxy/webapps/galaxy/api/annotations.py
+++ b/lib/galaxy/webapps/galaxy/api/annotations.py
@@ -1,21 +1,15 @@
"""
API operations on annotations.
"""
-import logging, os, string, shutil, urllib, re, socket
-from cgi import escape, FieldStorage
-from galaxy import util, datatypes, jobs, web, util
-from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin
+import logging
+from galaxy import web
from galaxy.model.item_attrs import UsesAnnotations
from galaxy.util.sanitize_html import sanitize_html
-import galaxy.datatypes
-from galaxy.util.bunch import Bunch
-
-import pkg_resources
-pkg_resources.require( "Routes" )
-import routes
+from galaxy.web.base.controller import BaseAPIController, HTTPNotImplemented, UsesHistoryDatasetAssociationMixin, UsesHistoryMixin, UsesStoredWorkflowMixin
log = logging.getLogger( __name__ )
+
class BaseAnnotationsController( BaseAPIController, UsesAnnotations, UsesHistoryMixin, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin ):
@web.expose_api
@@ -25,7 +19,6 @@
if item is not None:
return self.get_item_annotation_str( trans.sa_session, trans.get_user(), item )
-
@web.expose_api
def create( self, trans, payload, **kwd ):
if "text" not in payload:
@@ -53,23 +46,29 @@
def undelete( self, trans, **kwd ):
raise HTTPNotImplemented()
+
class HistoryAnnotationsController(BaseAnnotationsController):
controller_name = "history_annotations"
tagged_item_id = "history_id"
+
def _get_item_from_id(self, trans, idstr):
hist = self.get_history( trans, idstr )
return hist
+
class HistoryContentAnnotationsController(BaseAnnotationsController):
controller_name = "history_content_annotations"
tagged_item_id = "history_content_id"
+
def _get_item_from_id(self, trans, idstr):
hda = self.get_dataset(trans, idstr)
return hda
+
class WorkflowAnnotationsController(BaseAnnotationsController):
controller_name = "workflow_annotations"
tagged_item_id = "workflow_id"
+
def _get_item_from_id(self, trans, idstr):
hda = self.get_stored_workflow(trans, idstr)
return hda
https://bitbucket.org/galaxy/galaxy-central/commits/a19245efea88/
Changeset: a19245efea88
User: dannon
Date: 2013-10-25 01:08:04
Summary: Fix broken string_as_bool in history_contents API
Affected #: 1 file
diff -r 02bbb394aa9856c5a607a8e61e7dfea9a7241dfd -r a19245efea8847b994e90602c33813554a50ce03 lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -2,15 +2,12 @@
API operations on the contents of a history.
"""
-from galaxy import web, util
-from galaxy import exceptions
-from galaxy.web.base.controller import BaseAPIController, url_for
-from galaxy.web.base.controller import UsesHistoryDatasetAssociationMixin, UsesHistoryMixin
-from galaxy.web.base.controller import UsesLibraryMixin, UsesLibraryMixinItems
-from galaxy.datatypes import sniff
+import logging
+from galaxy import exceptions, util, web
+from galaxy.web.base.controller import (BaseAPIController, url_for,
+ UsesHistoryDatasetAssociationMixin, UsesHistoryMixin, UsesLibraryMixin,
+ UsesLibraryMixinItems)
-import os
-import logging
log = logging.getLogger( __name__ )
class HistoryContentsController( BaseAPIController, UsesHistoryDatasetAssociationMixin, UsesHistoryMixin,
@@ -354,7 +351,7 @@
# a request body is optional here
purge = False
if kwd.get( 'payload', None ):
- purge = string_as_bool( kwd['payload'].get( 'purge', False ) )
+ purge = util.string_as_bool( kwd['payload'].get( 'purge', False ) )
rval = { 'id' : id }
try:
@@ -389,7 +386,7 @@
log.exception( 'HDA API, delete: uncaught HTTPInternalServerError: %s, %s\n%s',
id, str( kwd ), str( http_server_err ) )
raise
- except exceptions.httpexceptions.HTTPException, http_exc:
+ except exceptions.httpexceptions.HTTPException:
raise
except Exception, exc:
log.exception( 'HDA API, delete: uncaught exception: %s, %s\n%s',
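The `purge` fix above qualifies `string_as_bool` with its module, since the bare name was never imported. A sketch of what such a helper typically does (an assumption about `galaxy.util.string_as_bool`'s behavior, not its verbatim source):

```python
def string_as_bool(value):
    """Interpret common truthy strings; anything else is False.

    Guessed behavior of galaxy.util.string_as_bool -- illustrative only.
    """
    return str(value).lower() in ('true', 'yes', 'on', '1')

# The corrected call site above then reads, in effect:
#     purge = string_as_bool(payload.get('purge', False))
```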
https://bitbucket.org/galaxy/galaxy-central/commits/28119d3941aa/
Changeset: 28119d3941aa
User: dannon
Date: 2013-10-25 01:10:39
Summary: Use standard spacing in history contents API, strip whitespace.
Affected #: 1 file
diff -r a19245efea8847b994e90602c33813554a50ce03 -r 28119d3941aae98156cdb4e07243ce7e3c4848a3 lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -10,8 +10,10 @@
log = logging.getLogger( __name__ )
+
class HistoryContentsController( BaseAPIController, UsesHistoryDatasetAssociationMixin, UsesHistoryMixin,
UsesLibraryMixin, UsesLibraryMixinItems ):
+
@web.expose_api_anonymous
def index( self, trans, history_id, ids=None, **kwd ):
"""
@@ -44,11 +46,9 @@
and ( history_id == trans.security.encode_id( trans.history.id ) ) ):
#TODO:?? is secure?
history = trans.history
-
# otherwise, check permissions for the history first
else:
history = self.get_history( trans, history_id, check_ownership=True, check_accessible=True )
-
# if ids, return _FULL_ data (as show) for each id passed
if ids:
ids = ids.split( ',' )
@@ -57,7 +57,6 @@
if encoded_hda_id in ids:
#TODO: share code with show
rval.append( self._detailed_hda_dict( trans, hda ) )
-
# if no ids passed, return a _SUMMARY_ of _all_ datasets in the history
else:
details = kwd.get( 'details', None ) or []
@@ -71,13 +70,11 @@
rval.append( self._detailed_hda_dict( trans, hda ) )
else:
rval.append( self._summary_hda_dict( trans, history_id, hda ) )
-
except Exception, e:
# for errors that are not specific to one hda (history lookup or summary list)
rval = "Error in history API at listing contents: " + str( e )
log.error( rval + ": %s, %s" % ( type( e ), str( e ) ), exc_info=True )
trans.response.status = 500
-
return rval
#TODO: move to model or Mixin
@@ -114,7 +111,6 @@
hda_dict[ 'display_types' ] = self.get_old_display_applications( trans, hda )
hda_dict[ 'display_apps' ] = self.get_display_apps( trans, hda )
return hda_dict
-
except Exception, exc:
# catch error here - returning a briefer hda_dict with an error attribute
log.exception( "Error in history API at listing contents with history %s, hda %s: (%s) %s",
@@ -149,24 +145,20 @@
#TODO: dataset/hda by id (from history) OR check_ownership for anon user
hda = self.get_history_dataset_association( trans, history, id,
check_ownership=False, check_accessible=True )
-
else:
#TODO: do we really need the history?
history = self.get_history( trans, history_id,
check_ownership=True, check_accessible=True, deleted=False )
hda = self.get_history_dataset_association( trans, history, id,
check_ownership=True, check_accessible=True )
-
hda_dict = self.get_hda_dict( trans, hda )
hda_dict[ 'display_types' ] = self.get_old_display_applications( trans, hda )
hda_dict[ 'display_apps' ] = self.get_display_apps( trans, hda )
-
except Exception, e:
msg = "Error in history API at listing dataset: %s" % ( str(e) )
log.error( msg, exc_info=True )
trans.response.status = 500
return msg
-
return hda_dict
@web.expose_api
@@ -183,28 +175,25 @@
copy from library:
'source' = 'library'
'content' = [the encoded id from the library dataset]
-
+
copy from HDA:
'source' = 'hda'
'content' = [the encoded id from the HDA]
..note:
Currently, a user can only copy an HDA from a history that the user owns.
-
+
:rtype: dict
:returns: dictionary containing detailed information for the new HDA
"""
-
#TODO: copy existing, accessible hda - dataset controller, copy_datasets
#TODO: convert existing, accessible hda - model.DatasetInstance(or hda.datatype).get_converter_types
-
# check parameters
source = payload.get('source', None)
content = payload.get('content', None)
if source not in ['library', 'hda'] or content is None:
trans.response.status = 400
return "Please define the source ('library' or 'hda') and the content."
-
# retrieve history
try:
history = self.get_history( trans, history_id, check_ownership=True, check_accessible=False )
@@ -212,10 +201,8 @@
# no way to tell if it failed bc of perms or other (all MessageExceptions)
trans.response.status = 500
return str( e )
-
# copy from library dataset
if source == 'library':
-
# get library data set
try:
ld = self.get_library_dataset( trans, content, check_ownership=False, check_accessible=False )
@@ -226,17 +213,14 @@
return str( e )
except Exception, e:
return str( e )
-
# insert into history
hda = ld.library_dataset_dataset_association.to_history_dataset_association( history, add_to_history=True )
trans.sa_session.flush()
return hda.to_dict()
-
elif source == 'hda':
try:
#NOTE: user currently only capable of copying one of their own datasets
hda = self.get_dataset( trans, content )
-
except ( exceptions.httpexceptions.HTTPRequestRangeNotSatisfiable,
exceptions.httpexceptions.HTTPBadRequest ), id_exc:
# wot...
@@ -250,12 +234,10 @@
trans.response.status = 500
log.exception( "history: %s, source: %s, content: %s", history_id, source, content )
return str( exc )
-
data_copy=hda.copy( copy_children=True )
result=history.add_dataset( data_copy )
trans.sa_session.flush()
return result.to_dict()
-
else:
# other options
trans.response.status = 501
@@ -291,27 +273,22 @@
if history_id != trans.security.encode_id( trans.history.id ):
trans.response.status = 401
return { 'error': 'Anonymous users cannot edit histories other than their current history' }
-
anon_allowed_payload = {}
if 'deleted' in payload:
anon_allowed_payload[ 'deleted' ] = payload[ 'deleted' ]
if 'visible' in payload:
anon_allowed_payload[ 'visible' ] = payload[ 'visible' ]
-
payload = self._validate_and_parse_update_payload( anon_allowed_payload )
hda = self.get_dataset( trans, id, check_ownership=False, check_accessible=False, check_state=True )
if hda.history != trans.history:
trans.response.status = 401
return { 'error': 'Anonymous users cannot edit datasets outside their current history' }
-
else:
payload = self._validate_and_parse_update_payload( payload )
hda = self.get_dataset( trans, id, check_ownership=True, check_accessible=True, check_state=True )
-
# get_dataset can return a string during an error
if hda and isinstance( hda, trans.model.HistoryDatasetAssociation ):
changed = self.set_hda_from_dict( trans, hda, payload )
-
except Exception, exception:
log.error( 'Update of history (%s), HDA (%s) failed: %s',
history_id, id, str( exception ), exc_info=True )
@@ -323,7 +300,6 @@
else:
trans.response.status = 500
return { 'error': str( exception ) }
-
return changed
@web.expose_api
@@ -352,22 +328,18 @@
purge = False
if kwd.get( 'payload', None ):
purge = util.string_as_bool( kwd['payload'].get( 'purge', False ) )
-
rval = { 'id' : id }
try:
hda = self.get_dataset( trans, id,
check_ownership=True, check_accessible=True, check_state=True )
hda.deleted = True
-
if purge:
if not trans.app.config.allow_user_dataset_purge:
raise exceptions.httpexceptions.HTTPForbidden(
detail='This instance does not allow user dataset purging' )
-
hda.purged = True
trans.sa_session.add( hda )
trans.sa_session.flush()
-
if hda.dataset.user_can_purge:
try:
hda.dataset.full_delete()
@@ -376,12 +348,9 @@
pass
# flush now to preserve deleted state in case of later interruption
trans.sa_session.flush()
-
rval[ 'purged' ] = True
-
trans.sa_session.flush()
rval[ 'deleted' ] = True
-
except exceptions.httpexceptions.HTTPInternalServerError, http_server_err:
log.exception( 'HDA API, delete: uncaught HTTPInternalServerError: %s, %s\n%s',
id, str( kwd ), str( http_server_err ) )
@@ -393,7 +362,6 @@
id, str( kwd ), str( exc ) )
trans.response.status = 500
rval.update({ 'error': str( exc ) })
-
return rval
def _validate_and_parse_update_payload( self, payload ):
@@ -416,7 +384,6 @@
'metadata_dbkey', 'metadata_column_names', 'metadata_column_types', 'metadata_columns',
'metadata_comment_lines', 'metadata_data_lines'
)
-
validated_payload = {}
for key, val in payload.items():
# TODO: lots of boilerplate here, but overhead on abstraction is equally onerous
https://bitbucket.org/galaxy/galaxy-central/commits/9fcd7759a210/
Changeset: 9fcd7759a210
User: dannon
Date: 2013-10-25 01:11:56
Summary: Irods objectstore: fix invalid rcCollCreate
Affected #: 1 file
diff -r 28119d3941aae98156cdb4e07243ce7e3c4848a3 -r 9fcd7759a210684ad2762340da7dbeab753c24c0 lib/galaxy/objectstore/rods.py
--- a/lib/galaxy/objectstore/rods.py
+++ b/lib/galaxy/objectstore/rods.py
@@ -6,16 +6,14 @@
import os
import time
-import errno
import logging
-#import traceback
from posixpath import join as path_join
from posixpath import basename as path_basename
from posixpath import dirname as path_dirname
from galaxy.objectstore import DiskObjectStore, ObjectStore, local_extra_dirs
-from galaxy.exceptions import ObjectNotFound, ObjectInvalid
+from galaxy.exceptions import ObjectNotFound
import galaxy.eggs
galaxy.eggs.require( 'PyRods' )
@@ -121,7 +119,7 @@
log.debug( 'Creating collection %s' % collname )
ci = irods.collInp_t()
ci.collName = collname
- status = rcCollCreate( self.rods_conn, ci )
+ status = irods.rcCollCreate( self.rods_conn, ci )
assert status == 0, '__mkcolls(): Failed to create collection: %s' % collname
@local_extra_dirs
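The `rods.py` fix above is the classic unqualified-call bug: `rcCollCreate` exists only as an attribute of the `irods` module, so calling it bare raises `NameError` at runtime. A generic, self-contained reproduction (the `irods` stand-in here is a mock, not the PyRods API):

```python
# Generic reproduction of the unqualified-call bug fixed above: a name that
# exists only as a module attribute raises NameError when called bare.
import types

irods = types.SimpleNamespace(rcCollCreate=lambda conn, ci: 0)  # mock module

def make_collection_broken(conn, ci):
    return rcCollCreate(conn, ci)        # NameError: never imported into scope

def make_collection_fixed(conn, ci):
    return irods.rcCollCreate(conn, ci)  # qualified call succeeds
```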
https://bitbucket.org/galaxy/galaxy-central/commits/b9e78741f439/
Changeset: b9e78741f439
User: dannon
Date: 2013-10-25 01:25:28
Summary: Import cleanup in toolshed repositories API
Affected #: 1 file
diff -r 9fcd7759a210684ad2762340da7dbeab753c24c0 -r b9e78741f439f06c30b07d7d591c7c83183be6a6 lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -1,11 +1,12 @@
-import logging
+import logging, os
from time import strftime
-from galaxy.web.framework.helpers import time_ago
+
from galaxy import eggs
+from galaxy import util
from galaxy import web
-from galaxy import util
from galaxy.util import json
from galaxy.web.base.controller import BaseAPIController
+from galaxy.web.framework.helpers import time_ago
import tool_shed.repository_types.util as rt_util
import tool_shed.util.shed_util_common as suc
from tool_shed.galaxy_install import repository_util
@@ -14,9 +15,7 @@
eggs.require( 'mercurial' )
-from mercurial import commands
from mercurial import hg
-from mercurial import ui
log = logging.getLogger( __name__ )
https://bitbucket.org/galaxy/galaxy-central/commits/b154d93fb369/
Changeset: b154d93fb369
User: dannon
Date: 2013-10-25 01:26:06
Summary: Fix broken debug statements in toolshed repositories API -- would have caused Exception in reset_metadata
Affected #: 1 file
diff -r b9e78741f439f06c30b07d7d591c7c83183be6a6 -r b154d93fb369443a5979aa49e59c42fe27e7df53 lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -278,7 +278,7 @@
encoded_id = trans.security.encode_id( repository.id )
if encoded_id in encoded_ids_to_skip:
log.debug( "Skipping repository with id %s because it is in encoded_ids_to_skip %s" % \
- ( str( repository_id ), str( encoded_ids_to_skip ) ) )
+ ( str( repository.id ), str( encoded_ids_to_skip ) ) )
elif repository.type == rt_util.TOOL_DEPENDENCY_DEFINITION and repository.id not in handled_repository_ids:
results = handle_repository( trans, repository, results )
# Now reset metadata on all remaining repositories.
@@ -286,7 +286,7 @@
encoded_id = trans.security.encode_id( repository.id )
if encoded_id in encoded_ids_to_skip:
log.debug( "Skipping repository with id %s because it is in encoded_ids_to_skip %s" % \
- ( str( repository_id ), str( encoded_ids_to_skip ) ) )
+ ( str( repository.id ), str( encoded_ids_to_skip ) ) )
elif repository.type != rt_util.TOOL_DEPENDENCY_DEFINITION and repository.id not in handled_repository_ids:
results = handle_repository( trans, repository, results )
stop_time = strftime( "%Y-%m-%d %H:%M:%S" )
https://bitbucket.org/galaxy/galaxy-central/commits/9de91cccfcf0/
Changeset: 9de91cccfcf0
User: dannon
Date: 2013-10-25 01:27:15
Summary: Remove unused encoded_repository_id from get_ordered_installable_revisions in toolshed repository API
Affected #: 1 file
diff -r b154d93fb369443a5979aa49e59c42fe27e7df53 -r 9de91cccfcf0b7fc5f1beac6209e0111fd5c4af2 lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -38,7 +38,6 @@
try:
# Get the repository information.
repository = suc.get_repository_by_name_and_owner( trans.app, name, owner )
- encoded_repository_id = trans.security.encode_id( repository.id )
repo_dir = repository.repo_path( trans.app )
repo = hg.repository( suc.get_configured_ui(), repo_dir )
ordered_installable_revisions = suc.get_ordered_metadata_changeset_revisions( repository, repo, downloadable=True )
https://bitbucket.org/galaxy/galaxy-central/commits/d0a805216378/
Changeset: d0a805216378
User: dannon
Date: 2013-10-25 01:28:02
Summary: Add missing import HTTPBadRequest to toolshed repository revisions API
Affected #: 1 file
diff -r 9de91cccfcf0b7fc5f1beac6209e0111fd5c4af2 -r d0a80521637842c250665010848e7eb749946cce lib/galaxy/webapps/tool_shed/api/repository_revisions.py
--- a/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
+++ b/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
@@ -5,7 +5,7 @@
from galaxy import web
from galaxy import util
from galaxy.model.orm import and_, not_, select
-from galaxy.web.base.controller import BaseAPIController
+from galaxy.web.base.controller import BaseAPIController, HTTPBadRequest
from tool_shed.util import export_util
import tool_shed.util.shed_util_common as suc
https://bitbucket.org/galaxy/galaxy-central/commits/880f16259947/
Changeset: 880f16259947
User: dannon
Date: 2013-10-25 01:29:09
Summary: Remove unused code in toolshed repository revisions API; strip whitespace.
Affected #: 1 file
diff -r d0a80521637842c250665010848e7eb749946cce -r 880f162599472ce9316ccef0e0c56b4d16c11424 lib/galaxy/webapps/tool_shed/api/repository_revisions.py
--- a/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
+++ b/lib/galaxy/webapps/tool_shed/api/repository_revisions.py
@@ -43,7 +43,6 @@
if not changeset_revision:
raise HTTPBadRequest( detail="Missing required parameter 'changeset_revision'." )
export_repository_dependencies = payload.get( 'export_repository_dependencies', False )
- download_dir = payload.get( 'download_dir', '/tmp' )
try:
# We'll currently support only gzip-compressed tar archives.
file_type = 'gz'
@@ -167,7 +166,6 @@
flush_needed = False
for key, new_value in payload.items():
if hasattr( repository_metadata, key ):
- old_value = getattr( repository_metadata, key )
setattr( repository_metadata, key, new_value )
if key in [ 'tools_functionally_correct', 'time_last_tested' ]:
# Automatically update repository_metadata.time_last_tested.
@@ -186,10 +184,10 @@
action='show',
id=trans.security.encode_id( repository_metadata.id ) )
return repository_metadata_dict
-
+
def __get_value_mapper( self, trans, repository_metadata ):
value_mapper = { 'id' : trans.security.encode_id,
'repository_id' : trans.security.encode_id }
if repository_metadata.time_last_tested is not None:
- value_mapper[ 'time_last_tested' ] = time_ago
+ value_mapper[ 'time_last_tested' ] = time_ago
return value_mapper
https://bitbucket.org/galaxy/galaxy-central/commits/3e5cbc50db38/
Changeset: 3e5cbc50db38
User: dannon
Date: 2013-10-25 01:31:10
Summary: Fix bug in galaxy's tool_shed_repositories API -- would cause exception if there was ever a shed_tool_conf.
Affected #: 1 file
diff -r 880f162599472ce9316ccef0e0c56b4d16c11424 -r 3e5cbc50db38f44d54ee0682d4320658ba6d4763 lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -267,8 +267,7 @@
if shed_tool_conf:
# Get the tool_path setting.
index, shed_conf_dict = suc.get_shed_tool_conf_dict( trans.app, shed_tool_conf )
- # BUG, FIXME: Shed config dict does not exist in this context
- tool_path = shed_config_dict[ 'tool_path' ]
+ tool_path = shed_conf_dict[ 'tool_path' ]
else:
# Pick a semi-random shed-related tool panel configuration file and get the tool_path setting.
for shed_config_dict in trans.app.toolbox.shed_tool_confs:
https://bitbucket.org/galaxy/galaxy-central/commits/31d16cab8447/
Changeset: 31d16cab8447
User: dannon
Date: 2013-10-25 01:32:30
Summary: Fix yet another missing import in tool_shed_repositories API
Affected #: 1 file
diff -r 3e5cbc50db38f44d54ee0682d4320658ba6d4763 -r 31d16cab8447c111e1de64575208db8d16063e17 lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -13,6 +13,8 @@
from tool_shed.util import encoding_util
from tool_shed.util import metadata_util
from tool_shed.util import workflow_util
+from tool_shed.util import tool_util
+
import tool_shed.util.shed_util_common as suc
log = logging.getLogger( __name__ )
https://bitbucket.org/galaxy/galaxy-central/commits/404f0c60e9bd/
Changeset: 404f0c60e9bd
User: dannon
Date: 2013-10-25 01:34:38
Summary: Complete cleanup of tool_shed_repositories API -- remove unused code and strip whitespace
Affected #: 1 file
diff -r 31d16cab8447c111e1de64575208db8d16063e17 -r 404f0c60e9bd45654bef1ea68bc64a26c0811bca lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -34,6 +34,7 @@
message += 'Installing_Galaxy_tool_shed_repository_tools_into_a_local_Galaxy_instance.'
return message
+
class ToolShedRepositoriesController( BaseAPIController ):
"""RESTful controller for interactions with tool shed repositories."""
@@ -43,7 +44,7 @@
GET /api/tool_shed_repositories/{encoded_tool_shed_repository_id}/exported_workflows
Display a list of dictionaries containing information about this tool shed repository's exported workflows.
-
+
:param id: the encoded id of the ToolShedRepository object
"""
# Example URL: http://localhost:8763/api/tool_shed_repositories/f2db41e1fa331b3e/exported_…
@@ -74,10 +75,10 @@
POST /api/tool_shed_repositories/import_workflow
Import the specified exported workflow contained in the specified installed tool shed repository into Galaxy.
-
+
:param key: the API key of the Galaxy user with which the imported workflow will be associated.
:param id: the encoded id of the ToolShedRepository object
-
+
The following parameters are included in the payload.
:param index: the index location of the workflow tuple in the list of exported workflows stored in the metadata for the specified repository
"""
@@ -109,7 +110,7 @@
POST /api/tool_shed_repositories/import_workflow
Import all of the exported workflows contained in the specified installed tool shed repository into Galaxy.
-
+
:param key: the API key of the Galaxy user with which the imported workflows will be associated.
:param id: the encoded id of the ToolShedRepository object
"""
@@ -229,7 +230,6 @@
return dict( status='error', error=message )
if raw_text:
items = json.from_json_string( raw_text )
- repository_dict = items[ 0 ]
repository_revision_dict = items[ 1 ]
repo_info_dict = items[ 2 ]
else:
@@ -415,7 +415,6 @@
install_tool_dependencies = payload.get( 'install_tool_dependencies', False )
new_tool_panel_section_label = payload.get( 'new_tool_panel_section_label', '' )
shed_tool_conf = payload.get( 'shed_tool_conf', None )
- tool_path = payload.get( 'tool_path', None )
tool_panel_section_id = payload.get( 'tool_panel_section_id', '' )
all_installed_tool_shed_repositories = []
for index, tool_shed_url in enumerate( tool_shed_urls ):
@@ -451,7 +450,6 @@
:param owner (required): the owner of the Repository
:param changset_revision (required): the changset_revision of the RepositoryMetadata object associated with the Repository
"""
- api_key = kwd.get( 'key', None )
# Get the information about the repository to be installed from the payload.
tool_shed_url = payload.get( 'tool_shed_url', '' )
if not tool_shed_url:
@@ -471,7 +469,6 @@
ordered_tsr_ids = repair_dict.get( 'ordered_tsr_ids', [] )
ordered_repo_info_dicts = repair_dict.get( 'ordered_repo_info_dicts', [] )
if ordered_tsr_ids and ordered_repo_info_dicts:
- repositories_for_repair = []
for index, tsr_id in enumerate( ordered_tsr_ids ):
repository = trans.sa_session.query( trans.model.ToolShedRepository ).get( trans.security.decode_id( tsr_id ) )
repo_info_dict = ordered_repo_info_dicts[ index ]
@@ -496,7 +493,7 @@
PUT /api/tool_shed_repositories/reset_metadata_on_installed_repositories
Resets all metadata on all repositories installed into Galaxy in an "orderly fashion".
-
+
:param key: the API key of the Galaxy admin user.
"""
try:
https://bitbucket.org/galaxy/galaxy-central/commits/23c9b4736570/
Changeset: 23c9b4736570
User: dannon
Date: 2013-10-25 06:39:47
Summary: Simplify undelete_quota, there are no params.
Affected #: 2 files
diff -r 404f0c60e9bd45654bef1ea68bc64a26c0811bca -r 23c9b4736570957fd2d54bde163d99943efb2e1c lib/galaxy/actions/admin.py
--- a/lib/galaxy/actions/admin.py
+++ b/lib/galaxy/actions/admin.py
@@ -150,7 +150,7 @@
message += ', '.join( names )
return message
- def _undelete_quota( self, quota, params ):
+ def _undelete_quota( self, quota ):
quotas = util.listify( quota )
names = []
for q in quotas:
diff -r 404f0c60e9bd45654bef1ea68bc64a26c0811bca -r 23c9b4736570957fd2d54bde163d99943efb2e1c lib/galaxy/webapps/galaxy/api/quotas.py
--- a/lib/galaxy/webapps/galaxy/api/quotas.py
+++ b/lib/galaxy/webapps/galaxy/api/quotas.py
@@ -5,7 +5,6 @@
from galaxy.web.base.controller import BaseAPIController, UsesQuotaMixin, url_for
from galaxy.web.base.controllers.admin import Admin
from galaxy import web, util
-from elementtree.ElementTree import XML
from galaxy.web.params import QuotaParamParser
from galaxy.actions.admin import AdminActions
@@ -140,8 +139,7 @@
Undeletes a quota
"""
quota = self.get_quota( trans, id, deleted=True )
- params = self.get_quota_params( payload )
try:
- return self._undelete_quota( quota, params )
+ return self._undelete_quota( quota )
except ActionInputError, e:
raise HTTPBadRequest( detail=str( e ) )
https://bitbucket.org/galaxy/galaxy-central/commits/5a259fa8aabb/
Changeset: 5a259fa8aabb
User: dannon
Date: 2013-10-25 06:40:01
Summary: Merge with central
Affected #: 6 files
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c lib/galaxy/webapps/galaxy/api/tools.py
--- a/lib/galaxy/webapps/galaxy/api/tools.py
+++ b/lib/galaxy/webapps/galaxy/api/tools.py
@@ -119,14 +119,18 @@
return { "message": { "type": "error", "data" : vars[ 'errors' ] } }
# TODO: check for errors and ensure that output dataset(s) are available.
- output_datasets = vars.get( 'out_data', {} ).values()
+ output_datasets = vars.get( 'out_data', {} ).iteritems()
rval = {
"outputs": []
}
outputs = rval[ "outputs" ]
#TODO:?? poss. only return ids?
- for output in output_datasets:
+ for output_name, output in output_datasets:
output_dict = output.to_dict()
+ #add the output name back into the output data structure
+ #so it's possible to figure out which newly created elements
+ #correspond with which tool file outputs
+ output_dict['output_name'] = output_name
outputs.append( trans.security.encode_dict_ids( output_dict ) )
return rval
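The merge above switches the outputs loop from `.values()` to `.iteritems()` so each serialized output can record which tool output it came from. The shape of that change, in a self-contained sketch (Python 3 `items()` standing in for the original's `iteritems()`, and a plain dict for `output.to_dict()`):

```python
# Sketch of the outputs loop above: iterate name/dataset pairs so each
# serialized output carries the tool output name it corresponds to.
def serialize_outputs(out_data):
    outputs = []
    for output_name, output in out_data.items():  # iteritems() in the py2 original
        output_dict = dict(output)                # stand-in for output.to_dict()
        output_dict['output_name'] = output_name
        outputs.append(output_dict)
    return {'outputs': outputs}
```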
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -341,10 +341,13 @@
elif action_type == 'setup_r_environment':
# setup an R environment
# <action type="setup_r_environment">
- # <r_base name="package_r_3_0_1" owner="bgruening" />
+ # <repository name="package_r_3_0_1" owner="bgruening">
+ # <package name="R" version="3.0.1" />
+ # </repository>
+ # <!-- allow installing an R packages -->
+ # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
# </action>
- # allow downloading and installing an R package
- # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
+
if action_dict.get( 'env_shell_file_paths', False ):
install_environment.add_env_shell_file_paths( action_dict[ 'env_shell_file_paths' ] )
else:
@@ -371,10 +374,73 @@
# R libraries are installed to $INSTALL_DIR (install_dir), we now set the R_LIBS path to that directory
# TODO: That code is used a lot for the different environments and should be refactored, once the environments are integrated
modify_env_command_dict = dict( name="R_LIBS", action="prepend_to", value=install_dir )
- modify_env_command = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
- return_code = handle_command( app, tool_dependency, install_dir, modify_env_command )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+
if return_code:
return
+ elif action_type == 'setup_ruby_environment':
+ # setup a Ruby environment
+ # <action type="setup_ruby_environment">
+ # <repository name="package_ruby_2_0" owner="bgruening">
+ # <package name="ruby" version="2.0" />
+ # </repository>
+ # <!-- allow downloading and installing a Ruby package from http://rubygems.org/ -->
+ # <package>protk</package>
+ # <package>protk=1.2.4</package>
+ # <package>http://url-to-some-gem-file.de/protk.gem</package>
+ # </action>
+ if action_dict.get( 'env_shell_file_paths', False ):
+ install_environment.add_env_shell_file_paths( action_dict[ 'env_shell_file_paths' ] )
+ else:
+ log.warning( 'Missing Ruby environment. Please check if your specified Ruby installation exists.' )
+ return
+
+ dir = os.path.curdir
+ current_dir = os.path.abspath( os.path.join( work_dir, dir ) )
+ with lcd( current_dir ):
+ with settings( warn_only=True ):
+ for (gem, gem_version) in action_dict[ 'ruby_packages' ]:
+ if os.path.isfile( gem ):
+ # we assume a local shipped gem file
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install --local %s''' % ( gem )
+ elif gem.find('://') != -1:
+ # we assume a URL to a gem file
+ url = gem
+ gem_name = url.split( '/' )[ -1 ]
+ td_common_util.url_download( work_dir, gem_name, url, extract=False )
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install --local %s ''' % ( gem_name )
+ else:
+ # gem file from rubygems.org with or without version number
+ if gem_version:
+ # version number was specified
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install %s --version "=%s"''' % ( gem, gem_version)
+ else:
+ # no version number given
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install %s''' % ( gem )
+ cmd = install_environment.build_command( td_common_util.evaluate_template( cmd, install_dir ) )
+ return_code = handle_command( app, tool_dependency, install_dir, cmd )
+ if return_code:
+ return
+
+ # Ruby libraries are installed to $INSTALL_DIR (install_dir), we now set the GEM_PATH path to that directory
+ # TODO: That code is used a lot for the different environments and should be refactored, once the environments are integrated
+ modify_env_command_dict = dict( name="GEM_PATH", action="prepend_to", value=install_dir )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+ if return_code:
+ return
+
+ modify_env_command_dict = dict( name="PATH", action="prepend_to", value=os.path.join(install_dir, 'bin') )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+ if return_code:
+ return
+
else:
# We're handling a complex repository dependency where we only have a set_environment tag set.
# <action type="set_environment">
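The `setup_ruby_environment` branch above picks one of three `gem install` invocations depending on whether the `<package>` value is a local file, a URL, or a rubygems.org name (optionally versioned). A sketch of just that command selection, with the download and `install_environment.build_command` wrapping omitted:

```python
import os

def build_gem_install_cmd(gem, gem_version=None):
    """Sketch of the gem command selection above (simplified: the real
    code also downloads URL gems via td_common_util.url_download and
    wraps the command for the install environment)."""
    prefix = ("export PATH=$PATH:$RUBY_HOME/bin && "
              "export GEM_HOME=$INSTALL_DIR && ")
    if os.path.isfile(gem):
        # A locally shipped gem file.
        return prefix + "gem install --local %s" % gem
    elif "://" in gem:
        # A URL to a gem file; the real code downloads it first.
        gem_name = gem.split("/")[-1]
        return prefix + "gem install --local %s" % gem_name
    elif gem_version:
        # rubygems.org gem with a pinned version.
        return prefix + 'gem install %s --version "=%s"' % (gem, gem_version)
    else:
        # rubygems.org gem, latest version.
        return prefix + "gem install %s" % gem
```

All three shapes export `GEM_HOME=$INSTALL_DIR` first so the gems land in the tool dependency's install directory rather than the system gem path.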
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -613,10 +613,13 @@
action_dict[ 'configure_opts' ] = configure_opts
elif action_type == 'setup_r_environment':
# setup an R environment
- # <action type="setup_r_environment" name="package_r_3_0_1" owner="bgruening">
- # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
+ # <action type="setup_r_environment">
+ # <repository name="package_r_3_0_1" owner="bgruening">
+ # <package name="R" version="3.0.1" />
+ # </repository>
+ # <!-- allow installing R packages -->
+ # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
# </action>
-
env_shell_file_paths = td_common_util.get_env_shell_file_paths( app, action_elem.find('repository') )
all_env_shell_file_paths.extend( env_shell_file_paths )
@@ -631,6 +634,46 @@
action_dict[ 'r_packages' ] = r_packages
else:
continue
+ elif action_type == 'setup_ruby_environment':
+ # setup a Ruby environment
+ # <action type="setup_ruby_environment">
+ # <repository name="package_ruby_2_0" owner="bgruening">
+ # <package name="ruby" version="2.0" />
+ # </repository>
+ # <!-- allow downloading and installing a Ruby package from http://rubygems.org/ -->
+ # <package>protk</package>
+ # <package>protk=1.2.4</package>
+ # <package>http://url-to-some-gem-file.de/protk.gem</package>
+ # </action>
+
+ env_shell_file_paths = td_common_util.get_env_shell_file_paths( app, action_elem.find('repository') )
+ all_env_shell_file_paths.extend( env_shell_file_paths )
+ if all_env_shell_file_paths:
+ action_dict[ 'env_shell_file_paths' ] = all_env_shell_file_paths
+ ruby_packages = list()
+ for env_elem in action_elem:
+ if env_elem.tag == 'package':
+ """
+ A valid gem definition can be:
+ protk=1.2.4
+ protk
+ ftp://ftp.gruening.de/protk.gem
+ """
+ gem_token = env_elem.text.strip().split('=')
+ if len(gem_token) == 2:
+ # version string
+ gem_name = gem_token[0]
+ gem_version = gem_token[1]
+ ruby_packages.append( [gem_name, gem_version] )
+ else:
+ # gem name for rubygems.org without version number
+ gem = env_elem.text.strip()
+ ruby_packages.append( [gem, None] )
+
+ if ruby_packages:
+ action_dict[ 'ruby_packages' ] = ruby_packages
+ else:
+ continue
elif action_type == 'make_install':
# make; make install; allow providing make options
if action_elem.text:
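The `<package>` parsing in the `install_util.py` hunk above splits each gem spec on `=` to separate an optional version. A self-contained sketch of that rule (note it would also split a URL that happens to contain `=`, which the original shares):

```python
def parse_gem_spec(text):
    """Sketch of the <package> element parsing above: 'protk=1.2.4'
    yields [name, version]; a bare name or a gem URL yields [spec, None]."""
    gem_token = text.strip().split('=')
    if len(gem_token) == 2:
        # A version string was specified.
        return [gem_token[0], gem_token[1]]
    # Bare gem name or URL, no version pinned.
    return [text.strip(), None]
```

The resulting `[name, version]` pairs are what the fabric_util side consumes as `action_dict['ruby_packages']`.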
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c test-data/extract_genomic_dna_out2.fasta
--- a/test-data/extract_genomic_dna_out2.fasta
+++ b/test-data/extract_genomic_dna_out2.fasta
@@ -1,6 +1,6 @@
->droPer1_super_1_139823_139913_-
+>droPer1_super_1_139823_139913_- AK028861
CGTCGGCTTCTGCTTCTGCTGATGATGGTCGTTCTTCTTCCTTTACTTCT
TCCTATTTTTCTTCCTTCCCTTACACTATATCTTCCTTTA
->droPer1_super_1_156750_156844_-
+>droPer1_super_1_156750_156844_- BC126698
CCGGGCTGCGGCAAGGGATTCACCTGCTCCAAACAGCTCAAGGTGCACTC
CCGCACGCACACGGGCGAGAAGCCCTATCACTGCGACATCTGCT
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c tools/extract/extract_genomic_dna.py
--- a/tools/extract/extract_genomic_dna.py
+++ b/tools/extract/extract_genomic_dna.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python
"""
usage: %prog $input $out_file1
- -1, --cols=N,N,N,N: Columns for start, end, strand in input file
+ -1, --cols=N,N,N,N,N: Columns for start, end, strand in input file
-d, --dbkey=N: Genome build of input file
-o, --output_format=N: the data type of the output file
-g, --GALAXY_DATA_INDEX_DIR=N: the directory containing alignseq.loc
@@ -54,7 +54,13 @@
#
options, args = doc_optparse.parse( __doc__ )
try:
- chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ if len(options.cols.split(',')) == 5:
+ # BED file
+ chrom_col, start_col, end_col, strand_col, name_col = parse_cols_arg( options.cols )
+ else:
+ # gff file
+ chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ name_col = False
dbkey = options.dbkey
output_format = options.output_format
gff_format = options.gff
@@ -136,7 +142,8 @@
if isinstance( feature, ( Header, Comment ) ):
line_count += 1
continue
-
+
+ name = ""
if gff_format and interpret_features:
# Processing features.
gff_util.convert_gff_coords_to_bed( feature )
@@ -153,6 +160,8 @@
chrom = fields[chrom_col]
start = int( fields[start_col] )
end = int( fields[end_col] )
+ if name_col:
+ name = fields[name_col]
if gff_format:
start, end = gff_util.convert_gff_coords_to_bed( [start, end] )
if includes_strand_col:
@@ -237,13 +246,16 @@
sequence = reverse_complement( sequence )
if output_format == "fasta" :
- l = len( sequence )
+ l = len( sequence )
c = 0
if gff_format:
start, end = gff_util.convert_bed_coords_to_gff( [ start, end ] )
fields = [dbkey, str( chrom ), str( start ), str( end ), strand]
meta_data = "_".join( fields )
- fout.write( ">%s\n" % meta_data )
+ if name.strip():
+ fout.write( ">%s %s\n" % (meta_data, name) )
+ else:
+ fout.write( ">%s\n" % meta_data )
while c < l:
b = min( c + 50, l )
fout.write( "%s\n" % str( sequence[c:b] ) )
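The `--cols` change above dispatches on the number of comma-separated columns: five means a BED-style input with a name column, four means GFF-style with no name. A sketch of that dispatch, approximating `parse_cols_arg` as a 1-based-to-0-based conversion (the real bx-python helper does more):

```python
def parse_cols(cols):
    """Sketch of the five-vs-four column dispatch above. parse_cols_arg
    itself is bx-python's parser; here it is approximated by converting
    1-based column strings to 0-based indices."""
    fields = [int(c) - 1 for c in cols.split(',')]
    if len(fields) == 5:
        # BED file: a name column is present.
        chrom_col, start_col, end_col, strand_col, name_col = fields
    else:
        # GFF file: no name column.
        chrom_col, start_col, end_col, strand_col = fields
        name_col = False
    return chrom_col, start_col, end_col, strand_col, name_col
```

Using `False` as the no-name sentinel is what lets the later `if name_col:` guard skip the name lookup for GFF input.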
diff -r 23c9b4736570957fd2d54bde163d99943efb2e1c -r 5a259fa8aabb4d977c5ed091d1658b9674d0e67c tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -1,4 +1,4 @@
-<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.2">
+<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.3"><description>using coordinates from assembled/unassembled genomes</description><command interpreter="python">
extract_genomic_dna.py $input $out_file1 -o $out_format -d $dbkey
@@ -11,9 +11,9 @@
#if isinstance( $input.datatype, $__app__.datatypes_registry.get_datatype_by_extension('gff').__class__):
-1 1,4,5,7 --gff
#else:
- -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol}
+ -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol},${input.metadata.nameCol}
#end if
-
+
#if $seq_source.index_source == "cached":
## Genomic data from cache.
-g ${GALAXY_DATA_INDEX_DIR}
@@ -52,16 +52,30 @@
</data></outputs><requirements>
+ <requirement type="package">ucsc_tools</requirement><requirement type="binary">faToTwoBit</requirement>
- <requirement type="package">ucsc_tools</requirement></requirements><tests><test><param name="input" value="1.bed" dbkey="hg17" ftype="bed" /><param name="interpret_features" value="yes"/><param name="index_source" value="cached"/>
- <param name="out_format" value="fasta"/>
- <output name="out_file1" file="extract_genomic_dna_out1.fasta" />
+ <param name="out_format" value="fasta"/>
+ <output name="out_file1">
+ <assert_contents>
+ <!-- First few lines... -->
+ <has_text text=">hg17_chr1_147962192_147962580_- CCDS989.1_cds_0_0_chr1_147962193_r" />
+ <has_text text="ACTTGATCCTGCTCCCTCGGTGTCTGCATTGACTCCTCATGCTGGGACTG" />
+ <has_text text="GACCCGTCAACCCCCCTGCTCGCTGCTCACGTACCTTCATCACTTTTAGT" />
+ <has_text text="GATGATGCAACTTTCGAGGAATGGTTCCCCCAAGGGCGGCCCCCAAAAGT" />
+ <!-- Last few lines... -->
+ <has_text text="GCTGTGGCACAGAACATGGACTCTGTGTTTAAGGAGCTCTTGGGAAAGAC" />
+ <has_text text="CTCTGTCCGCCAGGGCCTTGGGCCAGCATCTACCACCTCTCCCAGTCCTG" />
+ <has_text text="GGCCCCGAAGCCCAAAGGCCCCGCCCAGCAGCCGCCTGGGCAGGAACAAA" />
+ <has_text text="GGCTTCTCCCGGGGCCCTGGGGCCCCAGCCTCACCCTCAGCTTCCCACCC" />
+ <has_text text="CCAGGGCCTAGACACGACCCCCAAGCCACACTGA" />
+ </assert_contents>
+ </output></test><test><param name="input" value="droPer1.bed" dbkey="droPer1" ftype="bed" />
@@ -152,14 +166,14 @@
Extracting sequences with **FASTA** output data type returns::
- >hg17_chr7_127475281_127475310_+
+ >hg17_chr7_127475281_127475310_+ NM_000230
GTAGGAATCGCAGCGCCAGCGGTTGCAAG
- >hg17_chr7_127485994_127486166_+
+ >hg17_chr7_127485994_127486166_+ NM_000230
GCCCAAGAAGCCCATCCTGGGAAGGAAAATGCATTGGGGAACCCTGTGCG
GATTCTTGTGGCTTTGGCCCTATCTTTTCTATGTCCAAGCTGTGCCCATC
CAAAAAGTCCAAGATGACACCAAAACCCTCATCAAGACAATTGTCACCAG
GATCAATGACATTTCACACACG
- >hg17_chr7_127486011_127486166_+
+ >hg17_chr7_127486011_127486166_+ D49487
TGGGAAGGAAAATGCATTGGGGAACCCTGTGCGGATTCTTGTGGCTTTGG
CCCTATCTTTTCTATGTCCAAGCTGTGCCCATCCAAAAAGTCCAAGATGA
CACCAAAACCCTCATCAAGACAATTGTCACCAGGATCAATGACATTTCAC
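The FASTA output shown in the help text above comes from the header/wrapping logic in the `extract_genomic_dna.py` hunk: the BED name column, when present, is appended after the `_`-joined metadata, and the sequence is wrapped to 50-character lines. A standalone sketch:

```python
def write_fasta(meta_data, name, sequence, width=50):
    """Sketch of the FASTA writing above: optional name appended to the
    '>'-header, sequence wrapped to `width` characters per line."""
    lines = []
    if name.strip():
        lines.append(">%s %s" % (meta_data, name))
    else:
        lines.append(">%s" % meta_data)
    c, l = 0, len(sequence)
    while c < l:
        b = min(c + width, l)
        lines.append(sequence[c:b])
        c = b
    return "\n".join(lines) + "\n"
```

Keeping the name separated from the metadata by a space (rather than another `_`) is what makes it land in the FASTA description field, as the `NM_000230` and `D49487` examples show.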
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
6 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/1ce68c239d0e/
Changeset: 1ce68c239d0e
Branch: extract_genomic_dna_tool_enhancements
User: BjoernGruening
Date: 2013-10-18 18:31:44
Summary: Add the value (nameCol) in a given BED file to the FASTA header.
Affected #: 2 files
diff -r febd7622924885dd0729ce00924289cc1f0eb741 -r 1ce68c239d0ee59bd469c4520bf754eaa80ac04c tools/extract/extract_genomic_dna.py
--- a/tools/extract/extract_genomic_dna.py
+++ b/tools/extract/extract_genomic_dna.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python
"""
usage: %prog $input $out_file1
- -1, --cols=N,N,N,N: Columns for start, end, strand in input file
+ -1, --cols=N,N,N,N,N: Columns for start, end, strand in input file
-d, --dbkey=N: Genome build of input file
-o, --output_format=N: the data type of the output file
-g, --GALAXY_DATA_INDEX_DIR=N: the directory containing alignseq.loc
@@ -54,7 +54,13 @@
#
options, args = doc_optparse.parse( __doc__ )
try:
- chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ if len(options.cols.split(',')) == 5:
+ # BED file
+ chrom_col, start_col, end_col, strand_col, name_col = parse_cols_arg( options.cols )
+ else:
+ # gff file
+ chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ name_col = False
dbkey = options.dbkey
output_format = options.output_format
gff_format = options.gff
@@ -144,6 +150,7 @@
start = feature.start
end = feature.end
strand = feature.strand
+ name = ""
else:
# Processing lines, either interval or GFF format.
line = feature.rstrip( '\r\n' )
@@ -153,6 +160,8 @@
chrom = fields[chrom_col]
start = int( fields[start_col] )
end = int( fields[end_col] )
+ if name_col:
+ name = fields[name_col]
if gff_format:
start, end = gff_util.convert_gff_coords_to_bed( [start, end] )
if includes_strand_col:
@@ -237,13 +246,16 @@
sequence = reverse_complement( sequence )
if output_format == "fasta" :
- l = len( sequence )
+ l = len( sequence )
c = 0
if gff_format:
start, end = gff_util.convert_bed_coords_to_gff( [ start, end ] )
fields = [dbkey, str( chrom ), str( start ), str( end ), strand]
meta_data = "_".join( fields )
- fout.write( ">%s\n" % meta_data )
+ if name.strip():
+ fout.write( ">%s %s\n" % (meta_data, name) )
+ else:
+ fout.write( ">%s\n" % meta_data )
while c < l:
b = min( c + 50, l )
fout.write( "%s\n" % str( sequence[c:b] ) )
diff -r febd7622924885dd0729ce00924289cc1f0eb741 -r 1ce68c239d0ee59bd469c4520bf754eaa80ac04c tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -11,9 +11,9 @@
#if isinstance( $input.datatype, $__app__.datatypes_registry.get_datatype_by_extension('gff').__class__):
-1 1,4,5,7 --gff
#else:
- -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol}
+ -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol},${input.metadata.nameCol}
#end if
-
+
#if $seq_source.index_source == "cached":
## Genomic data from cache.
-g ${GALAXY_DATA_INDEX_DIR}
@@ -52,8 +52,8 @@
</data></outputs><requirements>
+ <requirement type="package">ucsc_tools</requirement><requirement type="binary">faToTwoBit</requirement>
- <requirement type="package">ucsc_tools</requirement></requirements><tests><test>
https://bitbucket.org/galaxy/galaxy-central/commits/9c6732931369/
Changeset: 9c6732931369
Branch: extract_genomic_dna_tool_enhancements
User: BjoernGruening
Date: 2013-10-24 18:13:18
Summary: bugfix and one test fix
Affected #: 3 files
diff -r 1ce68c239d0ee59bd469c4520bf754eaa80ac04c -r 9c67329313699a02108fe36dc0cee93fee362c70 test-data/extract_genomic_dna_out2.fasta
--- a/test-data/extract_genomic_dna_out2.fasta
+++ b/test-data/extract_genomic_dna_out2.fasta
@@ -1,6 +1,6 @@
->droPer1_super_1_139823_139913_-
+>droPer1_super_1_139823_139913_- AK028861
CGTCGGCTTCTGCTTCTGCTGATGATGGTCGTTCTTCTTCCTTTACTTCT
TCCTATTTTTCTTCCTTCCCTTACACTATATCTTCCTTTA
->droPer1_super_1_156750_156844_-
+>droPer1_super_1_156750_156844_- BC126698
CCGGGCTGCGGCAAGGGATTCACCTGCTCCAAACAGCTCAAGGTGCACTC
CCGCACGCACACGGGCGAGAAGCCCTATCACTGCGACATCTGCT
diff -r 1ce68c239d0ee59bd469c4520bf754eaa80ac04c -r 9c67329313699a02108fe36dc0cee93fee362c70 tools/extract/extract_genomic_dna.py
--- a/tools/extract/extract_genomic_dna.py
+++ b/tools/extract/extract_genomic_dna.py
@@ -142,7 +142,8 @@
if isinstance( feature, ( Header, Comment ) ):
line_count += 1
continue
-
+
+ name = ""
if gff_format and interpret_features:
# Processing features.
gff_util.convert_gff_coords_to_bed( feature )
@@ -150,7 +151,6 @@
start = feature.start
end = feature.end
strand = feature.strand
- name = ""
else:
# Processing lines, either interval or GFF format.
line = feature.rstrip( '\r\n' )
diff -r 1ce68c239d0ee59bd469c4520bf754eaa80ac04c -r 9c67329313699a02108fe36dc0cee93fee362c70 tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -60,7 +60,7 @@
<param name="input" value="1.bed" dbkey="hg17" ftype="bed" /><param name="interpret_features" value="yes"/><param name="index_source" value="cached"/>
- <param name="out_format" value="fasta"/>
+ <param name="out_format" value="fasta"/><output name="out_file1" file="extract_genomic_dna_out1.fasta" /></test><test>
https://bitbucket.org/galaxy/galaxy-central/commits/af3fc6046bd6/
Changeset: af3fc6046bd6
Branch: extract_genomic_dna_tool_enhancements
User: jmchilton
Date: 2013-10-25 06:03:57
Summary: Update hg17 test case for extract_genomic_dna.xml to reflect recent changes.
Cannot delete or modify previous output file - it is used as input for another tool. Use assertion testing to weaken test case. As this tool is migrated to the tool shed the hg17 test cases should probably be eliminated completely.
Affected #: 1 file
diff -r 9c67329313699a02108fe36dc0cee93fee362c70 -r af3fc6046bd693fdbf1ec26d119bf387d7a92bc2 tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -61,7 +61,21 @@
<param name="interpret_features" value="yes"/><param name="index_source" value="cached"/><param name="out_format" value="fasta"/>
- <output name="out_file1" file="extract_genomic_dna_out1.fasta" />
+ <output name="out_file1">
+ <assert_contents>
+ <!-- First few lines... -->
+ <has_text text=">hg17_chr1_147962192_147962580_- CCDS989.1_cds_0_0_chr1_147962193_r" />
+ <has_text text="ACTTGATCCTGCTCCCTCGGTGTCTGCATTGACTCCTCATGCTGGGACTG" />
+ <has_text text="GACCCGTCAACCCCCCTGCTCGCTGCTCACGTACCTTCATCACTTTTAGT" />
+ <has_text text="GATGATGCAACTTTCGAGGAATGGTTCCCCCAAGGGCGGCCCCCAAAAGT" />
+ <!-- Last few lines... -->
+ <has_text text="GCTGTGGCACAGAACATGGACTCTGTGTTTAAGGAGCTCTTGGGAAAGAC" />
+ <has_text text="CTCTGTCCGCCAGGGCCTTGGGCCAGCATCTACCACCTCTCCCAGTCCTG" />
+ <has_text text="GGCCCCGAAGCCCAAAGGCCCCGCCCAGCAGCCGCCTGGGCAGGAACAAA" />
+ <has_text text="GGCTTCTCCCGGGGCCCTGGGGCCCCAGCCTCACCCTCAGCTTCCCACCC" />
+ <has_text text="CCAGGGCCTAGACACGACCCCCAAGCCACACTGA" />
+ </assert_contents>
+ </output></test><test><param name="input" value="droPer1.bed" dbkey="droPer1" ftype="bed" />
https://bitbucket.org/galaxy/galaxy-central/commits/7b014440dda1/
Changeset: 7b014440dda1
Branch: extract_genomic_dna_tool_enhancements
User: jmchilton
Date: 2013-10-25 06:06:14
Summary: Slightly rev the tool version of extract_genomic_dna.xml since the output is slightly different.
Update tool help to reflect new output format.
Affected #: 1 file
diff -r af3fc6046bd693fdbf1ec26d119bf387d7a92bc2 -r 7b014440dda1d9dddb292598fb804510b5274438 tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -1,4 +1,4 @@
-<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.2">
+<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.3"><description>using coordinates from assembled/unassembled genomes</description><command interpreter="python">
extract_genomic_dna.py $input $out_file1 -o $out_format -d $dbkey
@@ -166,14 +166,14 @@
Extracting sequences with **FASTA** output data type returns::
- >hg17_chr7_127475281_127475310_+
+ >hg17_chr7_127475281_127475310_+ NM_000230
GTAGGAATCGCAGCGCCAGCGGTTGCAAG
- >hg17_chr7_127485994_127486166_+
+ >hg17_chr7_127485994_127486166_+ NM_000230
GCCCAAGAAGCCCATCCTGGGAAGGAAAATGCATTGGGGAACCCTGTGCG
GATTCTTGTGGCTTTGGCCCTATCTTTTCTATGTCCAAGCTGTGCCCATC
CAAAAAGTCCAAGATGACACCAAAACCCTCATCAAGACAATTGTCACCAG
GATCAATGACATTTCACACACG
- >hg17_chr7_127486011_127486166_+
+ >hg17_chr7_127486011_127486166_+ D49487
TGGGAAGGAAAATGCATTGGGGAACCCTGTGCGGATTCTTGTGGCTTTGG
CCCTATCTTTTCTATGTCCAAGCTGTGCCCATCCAAAAAGTCCAAGATGA
CACCAAAACCCTCATCAAGACAATTGTCACCAGGATCAATGACATTTCAC
https://bitbucket.org/galaxy/galaxy-central/commits/57e9ebe8ea6b/
Changeset: 57e9ebe8ea6b
User: jmchilton
Date: 2013-10-25 06:08:48
Summary: Merge pull request #239 changes into default.
Affected #: 3 files
diff -r d4e60067889e2bd8873010fa0fc887e725a4b695 -r 57e9ebe8ea6ba6686a489560cde62b309d2e5271 test-data/extract_genomic_dna_out2.fasta
--- a/test-data/extract_genomic_dna_out2.fasta
+++ b/test-data/extract_genomic_dna_out2.fasta
@@ -1,6 +1,6 @@
->droPer1_super_1_139823_139913_-
+>droPer1_super_1_139823_139913_- AK028861
CGTCGGCTTCTGCTTCTGCTGATGATGGTCGTTCTTCTTCCTTTACTTCT
TCCTATTTTTCTTCCTTCCCTTACACTATATCTTCCTTTA
->droPer1_super_1_156750_156844_-
+>droPer1_super_1_156750_156844_- BC126698
CCGGGCTGCGGCAAGGGATTCACCTGCTCCAAACAGCTCAAGGTGCACTC
CCGCACGCACACGGGCGAGAAGCCCTATCACTGCGACATCTGCT
diff -r d4e60067889e2bd8873010fa0fc887e725a4b695 -r 57e9ebe8ea6ba6686a489560cde62b309d2e5271 tools/extract/extract_genomic_dna.py
--- a/tools/extract/extract_genomic_dna.py
+++ b/tools/extract/extract_genomic_dna.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python
"""
usage: %prog $input $out_file1
- -1, --cols=N,N,N,N: Columns for start, end, strand in input file
+ -1, --cols=N,N,N,N,N: Columns for start, end, strand in input file
-d, --dbkey=N: Genome build of input file
-o, --output_format=N: the data type of the output file
-g, --GALAXY_DATA_INDEX_DIR=N: the directory containing alignseq.loc
@@ -54,7 +54,13 @@
#
options, args = doc_optparse.parse( __doc__ )
try:
- chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ if len(options.cols.split(',')) == 5:
+ # BED file
+ chrom_col, start_col, end_col, strand_col, name_col = parse_cols_arg( options.cols )
+ else:
+ # gff file
+ chrom_col, start_col, end_col, strand_col = parse_cols_arg( options.cols )
+ name_col = False
dbkey = options.dbkey
output_format = options.output_format
gff_format = options.gff
@@ -136,7 +142,8 @@
if isinstance( feature, ( Header, Comment ) ):
line_count += 1
continue
-
+
+ name = ""
if gff_format and interpret_features:
# Processing features.
gff_util.convert_gff_coords_to_bed( feature )
@@ -153,6 +160,8 @@
chrom = fields[chrom_col]
start = int( fields[start_col] )
end = int( fields[end_col] )
+ if name_col:
+ name = fields[name_col]
if gff_format:
start, end = gff_util.convert_gff_coords_to_bed( [start, end] )
if includes_strand_col:
@@ -237,13 +246,16 @@
sequence = reverse_complement( sequence )
if output_format == "fasta" :
- l = len( sequence )
+ l = len( sequence )
c = 0
if gff_format:
start, end = gff_util.convert_bed_coords_to_gff( [ start, end ] )
fields = [dbkey, str( chrom ), str( start ), str( end ), strand]
meta_data = "_".join( fields )
- fout.write( ">%s\n" % meta_data )
+ if name.strip():
+ fout.write( ">%s %s\n" % (meta_data, name) )
+ else:
+ fout.write( ">%s\n" % meta_data )
while c < l:
b = min( c + 50, l )
fout.write( "%s\n" % str( sequence[c:b] ) )
diff -r d4e60067889e2bd8873010fa0fc887e725a4b695 -r 57e9ebe8ea6ba6686a489560cde62b309d2e5271 tools/extract/extract_genomic_dna.xml
--- a/tools/extract/extract_genomic_dna.xml
+++ b/tools/extract/extract_genomic_dna.xml
@@ -1,4 +1,4 @@
-<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.2">
+<tool id="Extract genomic DNA 1" name="Extract Genomic DNA" version="2.2.3"><description>using coordinates from assembled/unassembled genomes</description><command interpreter="python">
extract_genomic_dna.py $input $out_file1 -o $out_format -d $dbkey
@@ -11,9 +11,9 @@
#if isinstance( $input.datatype, $__app__.datatypes_registry.get_datatype_by_extension('gff').__class__):
-1 1,4,5,7 --gff
#else:
- -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol}
+ -1 ${input.metadata.chromCol},${input.metadata.startCol},${input.metadata.endCol},${input.metadata.strandCol},${input.metadata.nameCol}
#end if
-
+
#if $seq_source.index_source == "cached":
## Genomic data from cache.
-g ${GALAXY_DATA_INDEX_DIR}
@@ -52,16 +52,30 @@
</data></outputs><requirements>
+ <requirement type="package">ucsc_tools</requirement><requirement type="binary">faToTwoBit</requirement>
- <requirement type="package">ucsc_tools</requirement></requirements><tests><test><param name="input" value="1.bed" dbkey="hg17" ftype="bed" /><param name="interpret_features" value="yes"/><param name="index_source" value="cached"/>
- <param name="out_format" value="fasta"/>
- <output name="out_file1" file="extract_genomic_dna_out1.fasta" />
+ <param name="out_format" value="fasta"/>
+ <output name="out_file1">
+ <assert_contents>
+ <!-- First few lines... -->
+ <has_text text=">hg17_chr1_147962192_147962580_- CCDS989.1_cds_0_0_chr1_147962193_r" />
+ <has_text text="ACTTGATCCTGCTCCCTCGGTGTCTGCATTGACTCCTCATGCTGGGACTG" />
+ <has_text text="GACCCGTCAACCCCCCTGCTCGCTGCTCACGTACCTTCATCACTTTTAGT" />
+ <has_text text="GATGATGCAACTTTCGAGGAATGGTTCCCCCAAGGGCGGCCCCCAAAAGT" />
+ <!-- Last few lines... -->
+ <has_text text="GCTGTGGCACAGAACATGGACTCTGTGTTTAAGGAGCTCTTGGGAAAGAC" />
+ <has_text text="CTCTGTCCGCCAGGGCCTTGGGCCAGCATCTACCACCTCTCCCAGTCCTG" />
+ <has_text text="GGCCCCGAAGCCCAAAGGCCCCGCCCAGCAGCCGCCTGGGCAGGAACAAA" />
+ <has_text text="GGCTTCTCCCGGGGCCCTGGGGCCCCAGCCTCACCCTCAGCTTCCCACCC" />
+ <has_text text="CCAGGGCCTAGACACGACCCCCAAGCCACACTGA" />
+ </assert_contents>
+ </output></test><test><param name="input" value="droPer1.bed" dbkey="droPer1" ftype="bed" />
@@ -152,14 +166,14 @@
Extracting sequences with **FASTA** output data type returns::
- >hg17_chr7_127475281_127475310_+
+ >hg17_chr7_127475281_127475310_+ NM_000230
GTAGGAATCGCAGCGCCAGCGGTTGCAAG
- >hg17_chr7_127485994_127486166_+
+ >hg17_chr7_127485994_127486166_+ NM_000230
GCCCAAGAAGCCCATCCTGGGAAGGAAAATGCATTGGGGAACCCTGTGCG
GATTCTTGTGGCTTTGGCCCTATCTTTTCTATGTCCAAGCTGTGCCCATC
CAAAAAGTCCAAGATGACACCAAAACCCTCATCAAGACAATTGTCACCAG
GATCAATGACATTTCACACACG
- >hg17_chr7_127486011_127486166_+
+ >hg17_chr7_127486011_127486166_+ D49487
TGGGAAGGAAAATGCATTGGGGAACCCTGTGCGGATTCTTGTGGCTTTGG
CCCTATCTTTTCTATGTCCAAGCTGTGCCCATCCAAAAAGTCCAAGATGA
CACCAAAACCCTCATCAAGACAATTGTCACCAGGATCAATGACATTTCAC
https://bitbucket.org/galaxy/galaxy-central/commits/7286338fd77e/
Changeset: 7286338fd77e
Branch: extract_genomic_dna_tool_enhancements
User: jmchilton
Date: 2013-10-25 06:09:25
Summary: Close branch extract_genomic_dna_tool_enhancements.
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/91399e1ae5fc/
Changeset: 91399e1ae5fc
User: BjoernGruening
Date: 2013-10-23 13:45:33
Summary: Add setup_ruby_environment action type
Affected #: 2 files
diff -r 115a6924dc4c459467ca162e10ee24ba04001a1e -r 91399e1ae5fc827ddc9b33702fe401a10d367539 lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -375,6 +375,68 @@
return_code = handle_command( app, tool_dependency, install_dir, modify_env_command )
if return_code:
return
+ elif action_type == 'setup_ruby_environment':
+ # setup a Ruby environment
+ # <action type="setup_ruby_environment">
+ # <repository name="package_ruby_2_0" owner="bgruening">
+ # <package name="ruby" version="2.0" />
+ # </repository>
+ # <!-- allow downloading and installing a Ruby package from http://rubygems.org/ -->
+ # <package>protk</package>
+ # <package>protk=1.2.4</package>
+ # <package>http://url-to-some-gem-file.de/protk.gem</package>
+ # </action>
+ if action_dict.get( 'env_shell_file_paths', False ):
+ install_environment.add_env_shell_file_paths( action_dict[ 'env_shell_file_paths' ] )
+ else:
+ log.warning( 'Missing Ruby environment. Please check if your specified Ruby installation exists.' )
+ return
+
+ dir = os.path.curdir
+ current_dir = os.path.abspath( os.path.join( work_dir, dir ) )
+ with lcd( current_dir ):
+ with settings( warn_only=True ):
+ for (gem, gem_version) in action_dict[ 'ruby_packages' ]:
+ if os.path.isfile( gem ):
+ # we assume a local shipped gem file
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install --local %s''' % ( gem )
+ elif gem.find('://') != -1:
+ # we assume a URL to a gem file
+ url = gem
+ gem_name = url.split( '/' )[ -1 ]
+ td_common_util.url_download( work_dir, gem_name, url, extract=False )
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install --local %s ''' % ( gem_name )
+ else:
+ # gem file from rubygems.org with or without version number
+ if gem_version:
+ # version number was specified
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install %s --version "=%s"''' % ( gem, gem_version)
+ else:
+ # no version number given
+ cmd = '''export PATH=$PATH:$RUBY_HOME/bin && export GEM_HOME=$INSTALL_DIR &&
+ gem install %s''' % ( gem )
+ cmd = install_environment.build_command( td_common_util.evaluate_template( cmd, install_dir ) )
+ return_code = handle_command( app, tool_dependency, install_dir, cmd )
+ if return_code:
+ return
+
+ # Ruby libraries are installed to $INSTALL_DIR (install_dir), we now set the GEM_PATH path to that directory
+ # TODO: That code is used a lot for the different environments and should be refactored, once the environments are integrated
+ modify_env_command_dict = dict( name="GEM_PATH", action="prepend_to", value=install_dir )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+ if return_code:
+ return
+
+ modify_env_command_dict = dict( name="PATH", action="prepend_to", value=os.path.join(install_dir, 'bin') )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+ if return_code:
+ return
+
else:
# We're handling a complex repository dependency where we only have a set_environment tag set.
# <action type="set_environment">
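Both the R and Ruby branches above end by turning a `modify_env_command_dict` (e.g. prepend `install_dir` to `GEM_PATH`) into an env.sh entry that is appended to a shell file. A hypothetical rendering of the `prepend_to` case — the entry format here is an illustrative guess, and the real `create_or_update_env_shell_file` also handles other actions and returns the target file path:

```python
def env_shell_entry(name, action, value):
    """Hypothetical sketch of one env.sh line for the prepend_to pattern
    above; the actual create_or_update_env_shell_file supports more
    actions and manages the file itself."""
    if action == "prepend_to":
        # Put the new value first, keep whatever was already set.
        return "%s=%s:$%s; export %s" % (name, value, name, name)
    raise ValueError("unsupported action: %s" % action)
```

Sourcing the resulting env.sh before a dependent tool runs is what makes the freshly installed gems (or R libraries) visible on `GEM_PATH`/`R_LIBS`.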
diff -r 115a6924dc4c459467ca162e10ee24ba04001a1e -r 91399e1ae5fc827ddc9b33702fe401a10d367539 lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -631,6 +631,46 @@
action_dict[ 'r_packages' ] = r_packages
else:
continue
+ elif action_type == 'setup_ruby_environment':
+ # set up a Ruby environment
+ # <action type="setup_ruby_environment">
+ # <repository name="package_ruby_2_0" owner="bgruening">
+ # <package name="ruby" version="2.0" />
+ # </repository>
+ # <!-- allow downloading and installing a Ruby package from http://rubygems.org/ -->
+ # <package>protk</package>
+ # <package>protk=1.2.4</package>
+ # <package>http://url-to-some-gem-file.de/protk.gem</package>
+ # </action>
+
+ env_shell_file_paths = td_common_util.get_env_shell_file_paths( app, action_elem.find('repository') )
+ all_env_shell_file_paths.extend( env_shell_file_paths )
+ if all_env_shell_file_paths:
+ action_dict[ 'env_shell_file_paths' ] = all_env_shell_file_paths
+ ruby_packages = list()
+ for env_elem in action_elem:
+ if env_elem.tag == 'package':
+ """
+ A valid gem definition can be:
+ protk=1.2.4
+ protk
+ ftp://ftp.gruening.de/protk.gem
+ """
+ gem_token = env_elem.text.strip().split('=')
+ if len(gem_token) == 2:
+ # version string
+ gem_name = gem_token[0]
+ gem_version = gem_token[1]
+ ruby_packages.append( [gem_name, gem_version] )
+ else:
+ # gem name for rubygems.org without version number
+ gem = env_elem.text.strip()
+ ruby_packages.append( [gem, None] )
+
+ if ruby_packages:
+ action_dict[ 'ruby_packages' ] = ruby_packages
+ else:
+ continue
elif action_type == 'make_install':
# make; make install; allow providing make options
if action_elem.text:
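The `<package>` parsing added above accepts three spellings: a bare gem name, a `name=version` pair, and a URL to a `.gem` file. A minimal sketch of that tokenizing step, with an illustrative function name:

```python
def parse_gem_token(text):
    # Split a <package> entry into (gem, version). Version is None when the
    # entry is a bare gem name or a URL to a .gem file (illustrative helper
    # mirroring the install_util logic above).
    token = text.strip().split('=')
    if len(token) == 2:
        # e.g. "protk=1.2.4" -> pinned version
        return token[0], token[1]
    # e.g. "protk" or "http://url-to-some-gem-file.de/protk.gem"
    return text.strip(), None
```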
https://bitbucket.org/galaxy/galaxy-central/commits/d4e60067889e/
Changeset: d4e60067889e
User: BjoernGruening
Date: 2013-10-23 14:39:49
Summary: cleanup & bugfix for R environment
Affected #: 2 files
diff -r 91399e1ae5fc827ddc9b33702fe401a10d367539 -r d4e60067889e2bd8873010fa0fc887e725a4b695 lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -341,10 +341,13 @@
elif action_type == 'setup_r_environment':
# setup an R environment
# <action type="setup_r_environment">
- # <r_base name="package_r_3_0_1" owner="bgruening" />
+ # <repository name="package_r_3_0_1" owner="bgruening">
+ # <package name="R" version="3.0.1" />
+ # </repository>
+ # <!-- allow installing R packages -->
+ # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
# </action>
- # allow downloading and installing an R package
- # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
+
if action_dict.get( 'env_shell_file_paths', False ):
install_environment.add_env_shell_file_paths( action_dict[ 'env_shell_file_paths' ] )
else:
@@ -371,8 +374,9 @@
# R libraries are installed to $INSTALL_DIR (install_dir), we now set the R_LIBS path to that directory
# TODO: That code is used a lot for the different environments and should be refactored, once the environments are integrated
modify_env_command_dict = dict( name="R_LIBS", action="prepend_to", value=install_dir )
- modify_env_command = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
- return_code = handle_command( app, tool_dependency, install_dir, modify_env_command )
+ env_entry, env_file = td_common_util.create_or_update_env_shell_file( install_dir, modify_env_command_dict )
+ return_code = file_append( env_entry, env_file, skip_if_contained=True, make_executable=True )
+
if return_code:
return
elif action_type == 'setup_ruby_environment':
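Both the R and Ruby branches now record the environment change by appending an entry to an env.sh file via `file_append(..., skip_if_contained=True)`. A rough sketch of that idempotent-append pattern, assuming a simple `NAME=value:$NAME; export NAME` entry format (the real helpers are `create_or_update_env_shell_file` and `file_append`; this is a simplified stand-in):

```python
import os

def prepend_env_entry(env_file, name, value):
    # Append a line like 'GEM_PATH=/dir:$GEM_PATH; export GEM_PATH' to the
    # env shell file, skipping the write when the entry is already present
    # (illustrative stand-in for file_append with skip_if_contained=True).
    entry = '%s=%s:$%s; export %s' % (name, value, name, name)
    if os.path.exists(env_file):
        with open(env_file) as handle:
            if entry in handle.read():
                return 0  # already contained, nothing to do
    with open(env_file, 'a') as handle:
        handle.write(entry + '\n')
    return 0
```

Running the installer twice therefore leaves a single entry per variable instead of accumulating duplicates.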
diff -r 91399e1ae5fc827ddc9b33702fe401a10d367539 -r d4e60067889e2bd8873010fa0fc887e725a4b695 lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -613,10 +613,13 @@
action_dict[ 'configure_opts' ] = configure_opts
elif action_type == 'setup_r_environment':
# setup an R environment
- # <action type="setup_r_environment" name="package_r_3_0_1" owner="bgruening">
- # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
+ # <action type="setup_r_environment">
+ # <repository name="package_r_3_0_1" owner="bgruening">
+ # <package name="R" version="3.0.1" />
+ # </repository>
+ # <!-- allow installing R packages -->
+ # <package>https://github.com/bgruening/download_store/raw/master/DESeq2-1_0_18/BiocGe…</package>
# </action>
-
env_shell_file_paths = td_common_util.get_env_shell_file_paths( app, action_elem.find('repository') )
all_env_shell_file_paths.extend( env_shell_file_paths )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/21a13e4865b3/
Changeset: 21a13e4865b3
Branch: next-stable
User: jmchilton
Date: 2013-04-23 16:57:14
Summary: Add exception handling for Binary sniffers. Image sniffers can fail for large files because signed integers are used internally; this catches that and other potential problems.
Traceback (most recent call last):
File "/opt/galaxy/web/tools/data_source/upload.py", line 432, in <module>
__main__()
File "/opt/galaxy/web/tools/data_source/upload.py", line 421, in __main__
add_file( dataset, registry, json_file, output_path )
File "/opt/galaxy/web/tools/data_source/upload.py", line 155, in add_file
type_info = Binary.is_sniffable_binary( dataset.path )
File "/opt/galaxy/web/lib/galaxy/datatypes/binary.py", line 38, in is_sniffable_binary
if format["class"]().sniff(filename):
File "/opt/galaxy/web/lib/galaxy/datatypes/images.py", line 203, in sniff
headers = get_headers(filename, None, 1)
File "/opt/galaxy/web/lib/galaxy/datatypes/sniff.py", line 179, in get_headers
for idx, line in enumerate(file(fname)):
SystemError: Negative size passed to PyString_FromStringAndSize
Affected #: 1 file
diff -r 7266b5e09cb20cac801cddef87b0466ddc32d41e -r 21a13e4865b3f5d997dd34bc977eb5f998b43024 lib/galaxy/datatypes/binary.py
--- a/lib/galaxy/datatypes/binary.py
+++ b/lib/galaxy/datatypes/binary.py
@@ -42,11 +42,20 @@
Binary.unsniffable_binary_formats.append(ext)
@staticmethod
- def is_sniffable_binary(filename):
+ def is_sniffable_binary( filename ):
+ format_information = None
for format in Binary.sniffable_binary_formats:
- if format["class"]().sniff(filename):
- return (format["type"], format["ext"])
- return None
+ format_instance = format[ "class" ]()
+ try:
+ if format_instance.sniff(filename):
+ format_information = ( format["type"], format[ "ext" ] )
+ break
+ except Exception:
+ # Sniffer raised an exception; this could happen for any
+ # number of reasons, so there is not much to do besides
+ # trying the next sniffer.
+ pass
+ return format_information
@staticmethod
def is_ext_unsniffable(ext):
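The defensive loop introduced above can be exercised on its own: any sniffer that raises (for example the `SystemError` from the traceback) is skipped and the next one is tried. A self-contained sketch using fake sniffer classes as stand-ins for Galaxy datatypes:

```python
def first_matching_format(sniffers, filename):
    # Return (type, ext) from the first sniffer that matches, skipping any
    # sniffer that raises -- the pattern from is_sniffable_binary above.
    for fmt in sniffers:
        instance = fmt["class"]()
        try:
            if instance.sniff(filename):
                return fmt["type"], fmt["ext"]
        except Exception:
            # A sniffer may fail on unusual input (e.g. very large files);
            # ignore it and try the next one.
            pass
    return None

class BrokenSniffer(object):
    def sniff(self, filename):
        # Simulates the failure mode from the traceback above.
        raise SystemError("Negative size passed to PyString_FromStringAndSize")

class MatchingSniffer(object):
    def sniff(self, filename):
        return True
```

With the broken sniffer listed first, the matching one is still found instead of the whole upload failing.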
https://bitbucket.org/galaxy/galaxy-central/commits/94aea2327373/
Changeset: 94aea2327373
Branch: next-stable
User: jmchilton
Date: 2013-04-23 19:24:16
Summary: Rework checking binary files. Why read line by line if we are only going to check 100 characters? Reading just the first 100 characters is cleaner, more efficient, and hopefully less error prone. Should avoid the following exception caused when checking large files:
Traceback (most recent call last):
File "/opt/galaxy/web/tools/data_source/upload.py", line 432, in <module>
__main__()
File "/opt/galaxy/web/tools/data_source/upload.py", line 421, in __main__
add_file( dataset, registry, json_file, output_path )
File "/opt/galaxy/web/tools/data_source/upload.py", line 283, in add_file
if check_binary( dataset.path ):
File "/opt/galaxy/web/lib/galaxy/datatypes/checkers.py", line 58, in check_binary
for chars in temp:
SystemError: Negative size passed to PyString_FromStringAndSize
Affected #: 1 file
diff -r 21a13e4865b3f5d997dd34bc977eb5f998b43024 -r 94aea2327373d4cb651e3db3fe5113a5b0c669f5 lib/galaxy/datatypes/checkers.py
--- a/lib/galaxy/datatypes/checkers.py
+++ b/lib/galaxy/datatypes/checkers.py
@@ -1,5 +1,6 @@
import os, gzip, re, gzip, zipfile, binascii, bz2, imghdr
from galaxy import util
+from StringIO import StringIO
try:
import Image as PIL
@@ -53,20 +54,15 @@
if file_path:
temp = open( name, "U" )
else:
- temp = name
+ temp = StringIO( name )
chars_read = 0
- for chars in temp:
- for char in chars:
- chars_read += 1
+ try:
+ for char in temp.read( 100 ):
if util.is_binary( char ):
is_binary = True
break
- if chars_read > 100:
- break
- if chars_read > 100:
- break
- if file_path:
- temp.close()
+ finally:
+ temp.close( )
return is_binary
def check_gzip( file_path ):
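The single-read approach from this commit can be sketched in isolation. Here the binary test is a simplified stand-in for `galaxy.util.is_binary` (treating non-whitespace control characters as binary), and the modern `io.StringIO` replaces the Python 2 `StringIO` module:

```python
import io

def check_binary(name, file_path=True, limit=100):
    # Report whether the first `limit` characters contain a non-printable
    # character -- one read() call instead of the old nested line loop.
    # `name` is a path when file_path is True, else the content itself.
    temp = open(name, "r") if file_path else io.StringIO(name)
    try:
        for char in temp.read(limit):
            # Simplified stand-in for util.is_binary: flag ASCII control
            # characters other than ordinary whitespace.
            if ord(char) < 32 and char not in "\t\n\r":
                return True
    finally:
        # The try/finally mirrors the commit: the handle is closed on both
        # the file and the in-memory path.
        temp.close()
    return False
```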