galaxy-commits
January 2013
- 1 participant
- 160 discussions
commit/galaxy-central: greg: Automatically create missing required tool shed files from samples when starting the tool shed server.
by Bitbucket 31 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/2a34f751c32c/
changeset: 2a34f751c32c
user: greg
date: 2013-01-31 17:47:32
summary: Automatically create missing required tool shed files from samples when starting the tool shed server.
affected #: 1 file
diff -r 5e60f2efceafaec9973ed1aa312cd66a0a11b3e8 -r 2a34f751c32cb9b4ab271d1f14c99406b9d04a54 run_community.sh
--- a/run_community.sh
+++ b/run_community.sh
@@ -1,4 +1,37 @@
#!/bin/sh
cd `dirname $0`
+
+SAMPLES="
+ community_wsgi.ini.sample
+ datatypes_conf.xml.sample
+ external_service_types_conf.xml.sample
+ migrated_tools_conf.xml.sample
+ reports_wsgi.ini.sample
+ shed_tool_conf.xml.sample
+ tool_conf.xml.sample
+ shed_tool_data_table_conf.xml.sample
+ tool_data_table_conf.xml.sample
+ tool_sheds_conf.xml.sample
+ openid_conf.xml.sample
+ universe_wsgi.ini.sample
+ tool-data/shared/ncbi/builds.txt.sample
+ tool-data/shared/ensembl/builds.txt.sample
+ tool-data/shared/ucsc/builds.txt.sample
+ tool-data/shared/ucsc/publicbuilds.txt.sample
+ tool-data/shared/igv/igv_build_sites.txt.sample
+ tool-data/shared/rviewer/rviewer_build_sites.txt.sample
+ tool-data/*.sample
+ static/welcome.html.sample
+"
+
+# Create any missing config/location files
+for sample in $SAMPLES; do
+ file=`echo $sample | sed -e 's/\.sample$//'`
+ if [ ! -f "$file" -a -f "$sample" ]; then
+ echo "Initializing $file from `basename $sample`"
+ cp $sample $file
+ fi
+done
+
python ./scripts/paster.py serve community_wsgi.ini --pid-file=community_webapp.pid --log-file=community_webapp.log $@
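The new block above derives each config file name by stripping the .sample suffix and copies the sample only when the target file is missing. A minimal Python sketch of the same idiom, with an abbreviated, illustrative sample list:

import os, shutil

SAMPLES = [ 'community_wsgi.ini.sample', 'datatypes_conf.xml.sample' ]  # abbreviated list
for sample in SAMPLES:
    target = sample[ : -len( '.sample' ) ]  # e.g. community_wsgi.ini
    if os.path.exists( sample ) and not os.path.exists( target ):
        print( 'Initializing %s from %s' % ( target, os.path.basename( sample ) ) )
        shutil.copy( sample, target )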
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Uncomment the default sqlite database connection string in community_wsgi.ini.sample.
by Bitbucket 31 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/5e60f2efceaf/
changeset: 5e60f2efceaf
user: greg
date: 2013-01-31 17:39:12
summary: Uncomment the default sqlite database connection string in community_wsgi.ini.sample.
affected #: 1 file
diff -r f48d3ebce23e422d9bf9d87851460ba0976a5094 -r 5e60f2efceafaec9973ed1aa312cd66a0a11b3e8 community_wsgi.ini.sample
--- a/community_wsgi.ini.sample
+++ b/community_wsgi.ini.sample
@@ -22,7 +22,7 @@
log_level = DEBUG
# Database connection
-#database_file = database/community.sqlite
+database_file = database/community.sqlite
# You may use a SQLAlchemy connection string to specify an external database instead
#database_connection = postgres:///community_test?host=/var/run/postgresql
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Allow a Galaxy admin to select a shed-related tool panel config file whose tool_path setting will be used to install repositories into Galaxy for those repositories for which a tool panel selection is not necessary.
by Bitbucket 31 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f48d3ebce23e/
changeset: f48d3ebce23e
user: greg
date: 2013-01-31 17:33:15
summary: Allow a Galaxy admin to select a shed-related tool panel config file whose tool_path setting will be used to install repositories into Galaxy for those repositories for which a tool panel selection is not necessary.
affected #: 4 files
diff -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 -r f48d3ebce23e422d9bf9d87851460ba0976a5094 lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -2088,10 +2088,7 @@
label = "-1:%s" % changeset_revision
return rev, label
def get_shed_tool_conf_dict( app, shed_tool_conf ):
- """
- Return the in-memory version of the shed_tool_conf file, which is stored in the config_elems entry
- in the shed_tool_conf_dict associated with the file.
- """
+ """Return the in-memory version of the shed_tool_conf file, which is stored in the config_elems entry in the shed_tool_conf_dict associated with the file."""
for index, shed_tool_conf_dict in enumerate( app.toolbox.shed_tool_confs ):
if shed_tool_conf == shed_tool_conf_dict[ 'config_filename' ]:
return index, shed_tool_conf_dict
diff -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 -r f48d3ebce23e422d9bf9d87851460ba0976a5094 lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -1199,6 +1199,7 @@
return trans.show_error_message( message )
message = kwd.get( 'message', '' )
status = kwd.get( 'status', 'done' )
+ shed_tool_conf = kwd.get( 'shed_tool_conf', None )
tool_shed_url = kwd[ 'tool_shed_url' ]
# Handle repository dependencies.
has_repository_dependencies = util.string_as_bool( kwd.get( 'has_repository_dependencies', False ) )
@@ -1229,16 +1230,9 @@
includes_tool_dependencies = util.string_as_bool( repo_information_dict.get( 'includes_tool_dependencies', False ) )
encoded_repo_info_dicts = util.listify( repo_information_dict.get( 'repo_info_dicts', [] ) )
repo_info_dicts = [ encoding_util.tool_shed_decode( encoded_repo_info_dict ) for encoded_repo_info_dict in encoded_repo_info_dicts ]
- if ( not includes_tools and not has_repository_dependencies ) or \
+ if ( ( not includes_tools and not has_repository_dependencies ) and kwd.get( 'select_shed_tool_panel_config_button', False ) ) or \
( ( includes_tools or has_repository_dependencies ) and kwd.get( 'select_tool_panel_section_button', False ) ):
install_repository_dependencies = CheckboxField.is_checked( install_repository_dependencies )
- if includes_tools:
- shed_tool_conf = kwd[ 'shed_tool_conf' ]
- else:
- # If installing a repository that includes no tools, get the relative tool_path from the file to which the migrated_tools_config
- # setting points.
- # FIXME: default to a temporary file or raise an exception or display an appropriate error message.
- shed_tool_conf = trans.app.config.migrated_tools_config
if includes_tool_dependencies:
install_tool_dependencies = CheckboxField.is_checked( install_tool_dependencies )
else:
@@ -1327,14 +1321,7 @@
return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
action='manage_repositories',
**kwd ) )
- if len( trans.app.toolbox.shed_tool_confs ) > 1:
- shed_tool_conf_select_field = build_shed_tool_conf_select_field( trans )
- shed_tool_conf = None
- else:
- shed_tool_conf_dict = trans.app.toolbox.shed_tool_confs[0]
- shed_tool_conf = shed_tool_conf_dict[ 'config_filename' ]
- shed_tool_conf = shed_tool_conf.replace( './', '', 1 )
- shed_tool_conf_select_field = None
+ shed_tool_conf_select_field = build_shed_tool_conf_select_field( trans )
tool_path = suc.get_tool_path_by_shed_tool_conf_filename( trans, shed_tool_conf )
tool_panel_section_select_field = build_tool_panel_section_select_field( trans )
if len( repo_info_dicts ) == 1:
@@ -1387,21 +1374,40 @@
install_tool_dependencies_check_box = CheckboxField( 'install_tool_dependencies', checked=install_tool_dependencies_check_box_checked )
# Handle repository dependencies check box.
install_repository_dependencies_check_box = CheckboxField( 'install_repository_dependencies', checked=True )
- return trans.fill_template( '/admin/tool_shed_repository/select_tool_panel_section.mako',
- encoded_repo_info_dicts=encoded_repo_info_dicts,
- includes_tools=includes_tools,
- includes_tool_dependencies=includes_tool_dependencies,
- install_tool_dependencies_check_box=install_tool_dependencies_check_box,
- has_repository_dependencies=has_repository_dependencies,
- install_repository_dependencies_check_box=install_repository_dependencies_check_box,
- new_tool_panel_section=new_tool_panel_section,
- containers_dict=containers_dict,
- shed_tool_conf=shed_tool_conf,
- shed_tool_conf_select_field=shed_tool_conf_select_field,
- tool_panel_section_select_field=tool_panel_section_select_field,
- tool_shed_url=kwd[ 'tool_shed_url' ],
- message=message,
- status=status )
+ if includes_tools or has_repository_dependencies:
+ return trans.fill_template( '/admin/tool_shed_repository/select_tool_panel_section.mako',
+ encoded_repo_info_dicts=encoded_repo_info_dicts,
+ includes_tools=includes_tools,
+ includes_tool_dependencies=includes_tool_dependencies,
+ install_tool_dependencies_check_box=install_tool_dependencies_check_box,
+ has_repository_dependencies=has_repository_dependencies,
+ install_repository_dependencies_check_box=install_repository_dependencies_check_box,
+ new_tool_panel_section=new_tool_panel_section,
+ containers_dict=containers_dict,
+ shed_tool_conf=shed_tool_conf,
+ shed_tool_conf_select_field=shed_tool_conf_select_field,
+ tool_panel_section_select_field=tool_panel_section_select_field,
+ tool_shed_url=kwd[ 'tool_shed_url' ],
+ message=message,
+ status=status )
+ else:
+ # If installing repositories that includes no tools and has not repository dependencies, display a page allowing the Galaxy administrator to
+ # select a shed-related tool panel configuration file whose tool_path setting will be the location the repositories will be installed.
+ return trans.fill_template( '/admin/tool_shed_repository/select_shed_tool_panel_config.mako',
+ encoded_repo_info_dicts=encoded_repo_info_dicts,
+ includes_tools=includes_tools,
+ includes_tool_dependencies=includes_tool_dependencies,
+ install_tool_dependencies_check_box=install_tool_dependencies_check_box,
+ has_repository_dependencies=has_repository_dependencies,
+ install_repository_dependencies_check_box=install_repository_dependencies_check_box,
+ new_tool_panel_section=new_tool_panel_section,
+ containers_dict=containers_dict,
+ shed_tool_conf=shed_tool_conf,
+ shed_tool_conf_select_field=shed_tool_conf_select_field,
+ tool_panel_section_select_field=tool_panel_section_select_field,
+ tool_shed_url=kwd[ 'tool_shed_url' ],
+ message=message,
+ status=status )
@web.expose
@web.require_admin
def reinstall_repository( self, trans, **kwd ):
diff -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 -r f48d3ebce23e422d9bf9d87851460ba0976a5094 templates/admin/tool_shed_repository/select_shed_tool_panel_config.mako
--- /dev/null
+++ b/templates/admin/tool_shed_repository/select_shed_tool_panel_config.mako
@@ -0,0 +1,114 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+<%namespace file="/admin/tool_shed_repository/common.mako" import="render_dependencies_section" />
+<%namespace file="/admin/tool_shed_repository/common.mako" import="render_readme_section" />
+<%namespace file="/webapps/community/repository/common.mako" import="*" />
+
+<%def name="stylesheets()">
+ ${parent.stylesheets()}
+ ${h.css( "library" )}
+</%def>
+
+<%def name="javascripts()">
+ ${parent.javascripts()}
+ ${h.js("libs/jquery/jquery.rating", "libs/jquery/jstorage" )}
+ ${container_javascripts()}
+</%def>
+
+<%
+ # Handle the case where an uninstalled repository encountered errors during the process of being reinstalled. In
+ # this case, the repository metadata is an empty dictionary, but one or both of has_repository_dependencies
+ # and includes_tool_dependencies may be True. If either of these are True but we have no metadata, we cannot install
+ # repository dependencies on this pass.
+ if has_repository_dependencies:
+ repository_dependencies = containers_dict[ 'repository_dependencies' ]
+ missing_repository_dependencies = containers_dict[ 'missing_repository_dependencies' ]
+ if repository_dependencies or missing_repository_dependencies:
+ can_display_repository_dependencies = True
+ else:
+ can_display_repository_dependencies = False
+ else:
+ can_display_repository_dependencies = False
+ if includes_tool_dependencies:
+ tool_dependencies = containers_dict[ 'tool_dependencies' ]
+ missing_tool_dependencies = containers_dict[ 'missing_tool_dependencies' ]
+ if tool_dependencies or missing_tool_dependencies:
+ can_display_tool_dependencies = True
+ else:
+ can_display_tool_dependencies = False
+ else:
+ can_display_tool_dependencies = False
+%>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<div class="warningmessage">
+ <p>
+ The core Galaxy development team does not maintain the contents of many Galaxy tool shed repositories. Some repository tools
+ may include code that produces malicious behavior, so be aware of what you are installing.
+ </p>
+ <p>
+ If you discover a repository that causes problems after installation, contact <a href="http://wiki.g2.bx.psu.edu/Support" target="_blank">Galaxy support</a>,
+ sending all necessary information, and appropriate action will be taken.
+ </p>
+ <p>
+ <a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Contacting_the_owner_of_a_repository" target="_blank">Contact the repository owner</a> for
+ general questions or concerns.
+ </p>
+</div>
+<div class="toolForm">
+ <div class="toolFormBody">
+ <form name="select_shed_tool_panel_config" id="select_shed_tool_panel_config" action="${h.url_for( controller='admin_toolshed', action='prepare_for_install', tool_shed_url=tool_shed_url, encoded_repo_info_dicts=encoded_repo_info_dicts, includes_tools=includes_tools, includes_tool_dependencies=includes_tool_dependencies )}" method="post" >
+ <div style="clear: both"></div>
+ <% readme_files_dict = containers_dict.get( 'readme_files', None ) %>
+ %if readme_files_dict:
+ <div class="form-row">
+ <table class="colored" width="100%">
+ <th bgcolor="#EBD9B2">Repository README file - may contain important installation or license information</th>
+ </table>
+ </div>
+ ${render_readme_section( containers_dict )}
+ <div style="clear: both"></div>
+ %endif
+ %if can_display_repository_dependencies or can_display_tool_dependencies:
+ <div class="form-row">
+ <table class="colored" width="100%">
+ <th bgcolor="#EBD9B2">Confirm dependency installation</th>
+ </table>
+ </div>
+ ${render_dependencies_section( install_repository_dependencies_check_box, install_tool_dependencies_check_box, containers_dict )}
+ <div style="clear: both"></div>
+ %endif
+ <div class="form-row">
+ <table class="colored" width="100%">
+ <th bgcolor="#EBD9B2">Choose the configuration file whose tool_path setting will be used for installing repositories</th>
+ </table>
+ </div>
+ %if shed_tool_conf_select_field:
+ <%
+ if len( shed_tool_conf_select_field.options ) == 1:
+ select_help = "Your Galaxy instance is configured with 1 shed-related tool configuration file, so repositories will be "
+ select_help += "installed using it's <b>tool_path</b> setting."
+ else:
+ select_help = "Your Galaxy instance is configured with %d shed-related tool configuration files, " % len( shed_tool_conf_select_field.options )
+ select_help += "so select the file whose <b>tool_path</b> setting you want used for installing repositories."
+ %>
+ <div class="form-row">
+ <label>Shed tool configuration file:</label>
+ ${shed_tool_conf_select_field.get_html()}
+ <div class="toolParamHelp" style="clear: both;">
+ ${select_help}
+ </div>
+ </div>
+ <div style="clear: both"></div>
+ %else:
+ <input type="hidden" name="shed_tool_conf" value="${shed_tool_conf}"/>
+ %endif
+ <div class="form-row">
+ <input type="submit" name="select_shed_tool_panel_config_button" value="Install"/>
+ </div>
+ </form>
+ </div>
+</div>
diff -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 -r f48d3ebce23e422d9bf9d87851460ba0976a5094 templates/admin/tool_shed_repository/select_tool_panel_section.mako
--- a/templates/admin/tool_shed_repository/select_tool_panel_section.mako
+++ b/templates/admin/tool_shed_repository/select_tool_panel_section.mako
@@ -87,12 +87,19 @@
</table></div>
%if shed_tool_conf_select_field:
+ <%
+ if len( shed_tool_conf_select_field.options ) == 1:
+ select_help = "Your Galaxy instance is configured with 1 shed-related tool configuration file, so repositories will be "
+ select_help += "installed using it's <b>tool_path</b> setting."
+ else:
+ select_help = "Your Galaxy instance is configured with %d shed-related tool configuration files, " % len( shed_tool_conf_select_field.options )
+ select_help += "so select the file whose <b>tool_path</b> setting you want used for installing repositories."
+ %><div class="form-row"><label>Shed tool configuration file:</label>
${shed_tool_conf_select_field.get_html()}
<div class="toolParamHelp" style="clear: both;">
- Your Galaxy instance is configured with ${len( shed_tool_conf_select_field.options )} shed tool configuration files,
- so choose one in which to configure the installed tools.
+ ${select_help}
</div></div><div style="clear: both"></div>
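For reference, the branching this controller change introduces can be summarized as a short sketch. The template paths come from the diff above; the helper name choose_template is hypothetical:

def choose_template( includes_tools, has_repository_dependencies ):
    # Repositories that provide tools or declare repository dependencies still get the
    # tool panel section page; everything else gets the new shed tool panel config page.
    if includes_tools or has_repository_dependencies:
        return '/admin/tool_shed_repository/select_tool_panel_section.mako'
    return '/admin/tool_shed_repository/select_shed_tool_panel_config.mako'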
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Restrict the ability to use reserved words in the tool shed for repository names and public user names.
by Bitbucket 31 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/1605621deee2/
changeset: 1605621deee2
user: greg
date: 2013-01-31 15:28:40
summary: Restrict the ability to use reserved words in the tool shed for repository names and public user names.
affected #: 2 files
diff -r 04e22199687138a539734d108bc711fe96ab1083 -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -951,7 +951,9 @@
if not description:
message = 'Enter a description.'
error = True
- if not error:
+ if error:
+ status = 'error'
+ else:
# Add the repository record to the db
repository = trans.app.model.Repository( name=name,
description=description,
@@ -2458,15 +2460,17 @@
# in length and must contain only lower-case letters, numbers, and the '_' character.
if name in [ 'None', None, '' ]:
return 'Enter the required repository name.'
+ if name in [ 'repos' ]:
+ return "The term <b>%s</b> is a reserved word in the tool shed, so it cannot be used as a repository name." % name
for repository in user.active_repositories:
if repository.name == name:
- return "You already have a repository named '%s', so choose a different name." % name
+ return "You already have a repository named <b>%s</b>, so choose a different name." % name
if len( name ) < 4:
return "Repository names must be at least 4 characters in length."
if len( name ) > 80:
return "Repository names cannot be more than 80 characters in length."
if not( VALID_REPOSITORYNAME_RE.match( name ) ):
- return "Repository names must contain only lower-case letters, numbers and underscore '_'."
+ return "Repository names must contain only lower-case letters, numbers and underscore <b>_</b>."
return ''
@web.expose
def view_changelog( self, trans, id, **kwd ):
diff -r 04e22199687138a539734d108bc711fe96ab1083 -r 1605621deee25ec1ae8e7a72f4c1dc4f257f6760 lib/galaxy/webapps/galaxy/controllers/user.py
--- a/lib/galaxy/webapps/galaxy/controllers/user.py
+++ b/lib/galaxy/webapps/galaxy/controllers/user.py
@@ -930,8 +930,11 @@
status=status )
def __validate( self, trans, params, email, password, confirm, username ):
# If coming from the community webapp, we'll require a public user name
- if trans.webapp.name == 'community' and not username:
- return "A public user name is required"
+ if trans.webapp.name == 'community':
+ if not username:
+ return "A public user name is required in the tool shed."
+ if username in [ 'repos' ]:
+ return "The term <b>%s</b> is a reserved word in the tool shed, so it cannot be used as a public user name." % username
message = validate_email( trans, email )
if not message:
message = validate_password( trans, password, confirm )
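A standalone sketch of the repository name checks added above. VALID_REPOSITORYNAME_RE is not shown in the diff, so the pattern below is inferred from the comment about lower-case letters, numbers and the '_' character:

import re

VALID_REPOSITORYNAME_RE = re.compile( r'^[a-z0-9_]+$' )  # inferred, not taken from the diff

def validate_repository_name( name, existing_names ):
    if name in [ 'None', None, '' ]:
        return 'Enter the required repository name.'
    if name in [ 'repos' ]:
        return "The term %s is a reserved word in the tool shed, so it cannot be used as a repository name." % name
    if name in existing_names:
        return "You already have a repository named %s, so choose a different name." % name
    if len( name ) < 4:
        return 'Repository names must be at least 4 characters in length.'
    if len( name ) > 80:
        return 'Repository names cannot be more than 80 characters in length.'
    if not VALID_REPOSITORYNAME_RE.match( name ):
        return "Repository names must contain only lower-case letters, numbers and underscore '_'."
    return ''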
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Don't assume a complex repository dependency was properly defined.
by Bitbucket 30 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/04e221996871/
changeset: 04e221996871
user: greg
date: 2013-01-30 22:47:20
summary: Don't assume a complex repository dependency was properly defined.
affected #: 1 file
diff -r ea3da2000fa733a2d3ff5d5629dec58d3573645c -r 04e22199687138a539734d108bc711fe96ab1083 lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -1223,7 +1223,8 @@
current_rd_tups, error_message = handle_repository_elem( app=app,
repository_elem=sub_elem,
repository_dependencies_tups=None )
- repository_dependency_tup = current_rd_tups[ 0 ]
+ if current_rd_tups:
+ repository_dependency_tup = current_rd_tups[ 0 ]
if requirements_dict:
dependency_key = '%s/%s' % ( package_name, package_version )
tool_dependencies_dict[ dependency_key ] = requirements_dict
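The guard matters because handle_repository_elem can return an empty list when the repository dependency definition is invalid, and indexing an empty list raises IndexError. A tiny illustration (values are hypothetical):

current_rd_tups = []            # e.g. the <repository> tag was rejected
repository_dependency_tup = []  # default assigned earlier in the method
if current_rd_tups:
    repository_dependency_tup = current_rd_tups[ 0 ]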
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Support for installation and administration of complex repository dependencies in Galaxy.
by Bitbucket 30 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ea3da2000fa7/
changeset: ea3da2000fa7
user: greg
date: 2013-01-30 22:29:37
summary: Support for installation and administration of complex repository dependencies in Galaxy.
affected #: 7 files
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -3153,6 +3153,11 @@
def can_reinstall_or_activate( self ):
return self.deleted
@property
+ def has_readme_files( self ):
+ if self.metadata:
+ return 'readme_files' in self.metadata
+ return False
+ @property
def has_repository_dependencies( self ):
if self.metadata:
return 'repository_dependencies' in self.metadata
@@ -3176,11 +3181,6 @@
def in_error_state( self ):
return self.status == self.installation_status.ERROR
@property
- def has_readme_files( self ):
- if self.metadata:
- return 'readme_files' in self.metadata
- return False
- @property
def repository_dependencies( self ):
required_repositories = []
for rrda in self.required_repositories:
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/tool_shed/tool_dependencies/fabric_util.py
--- a/lib/galaxy/tool_shed/tool_dependencies/fabric_util.py
+++ b/lib/galaxy/tool_shed/tool_dependencies/fabric_util.py
@@ -50,6 +50,7 @@
install_dir = actions_dict[ 'install_dir' ]
package_name = actions_dict[ 'package_name' ]
actions = actions_dict.get( 'actions', None )
+ filtered_actions = []
if actions:
with make_tmp_dir() as work_dir:
with lcd( work_dir ):
@@ -57,6 +58,8 @@
# are currently only two supported processes; download_by_url and clone via a "shell_command" action type.
action_type, action_dict = actions[ 0 ]
if action_type == 'download_by_url':
+ # Eliminate the download_by_url action so remaining actions can be processed correctly.
+ filtered_actions = actions[ 1: ]
url = action_dict[ 'url' ]
if 'target_filename' in action_dict:
downloaded_filename = action_dict[ 'target_filename' ]
@@ -75,15 +78,24 @@
dir = work_dir
elif action_type == 'shell_command':
# <action type="shell_command">git clone --recursive git://github.com/ekg/freebayes.git</action>
+ # Eliminate the shell_command clone action so remaining actions can be processed correctly.
+ filtered_actions = actions[ 1: ]
return_code = handle_command( app, tool_dependency, install_dir, action_dict[ 'command' ] )
if return_code:
return
dir = package_name
+ else:
+ # We're handling a complex repository dependency where we only have a set_environment tag set.
+ # <action type="set_environment">
+ # <environment_variable name="PATH" action="prepend_to">$INSTALL_DIR/bin</environment_variable>
+ # </action>
+ filtered_actions = [ a for a in actions ]
+ dir = install_dir
if not os.path.exists( dir ):
os.makedirs( dir )
# The package has been down-loaded, so we can now perform all of the actions defined for building it.
with lcd( dir ):
- for action_tup in actions[ 1: ]:
+ for action_tup in filtered_actions:
action_type, action_dict = action_tup
current_dir = os.path.abspath( os.path.join( work_dir, dir ) )
if action_type == 'make_directory':
@@ -93,6 +105,8 @@
source_dir=os.path.join( action_dict[ 'source_directory' ] ),
destination_dir=os.path.join( action_dict[ 'destination_directory' ] ) )
elif action_type == 'move_file':
+ # TODO: Remove this hack that resets current_dir so that the pre-compiled bwa binary can be found.
+ # current_dir = '/Users/gvk/workspaces_2008/bwa/bwa-0.5.9'
common_util.move_file( current_dir=current_dir,
source=os.path.join( action_dict[ 'source' ] ),
destination_dir=os.path.join( action_dict[ 'destination' ] ) )
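The filtered_actions logic above drops the first action only when it is a download or clone step, and keeps the full list for the set_environment-only case. A simplified sketch with made-up action tuples:

actions = [ ( 'download_by_url', { 'url': 'http://example.org/pkg.tar.gz' } ),
            ( 'move_directory_files', { 'source_directory': 'bin', 'destination_directory': 'dist' } ) ]
action_type, action_dict = actions[ 0 ]
if action_type in ( 'download_by_url', 'shell_command' ):
    # The first action is handled separately (download or clone), so drop it.
    filtered_actions = actions[ 1: ]
else:
    # set_environment-only case: keep every action and build in install_dir.
    filtered_actions = [ a for a in actions ]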
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/tool_shed/tool_dependencies/install_util.py
--- a/lib/galaxy/tool_shed/tool_dependencies/install_util.py
+++ b/lib/galaxy/tool_shed/tool_dependencies/install_util.py
@@ -1,8 +1,9 @@
-import sys, os, subprocess, tempfile
+import sys, os, subprocess, tempfile, urllib2
import common_util
import fabric_util
from galaxy.tool_shed import encoding_util
from galaxy.model.orm import and_
+from galaxy.web import url_for
from galaxy import eggs
import pkg_resources
@@ -11,6 +12,9 @@
from elementtree import ElementTree, ElementInclude
from elementtree.ElementTree import Element, SubElement
+def clean_tool_shed_url( base_url ):
+ protocol, base = base_url.split( '://' )
+ return base.rstrip( '/' )
def create_or_update_tool_dependency( app, tool_shed_repository, name, version, type, status, set_status=True ):
# Called from Galaxy (never the tool shed) when a new repository is being installed or when an uninstalled repository is being reinstalled.
sa_session = app.model.context.current
@@ -28,6 +32,64 @@
sa_session.add( tool_dependency )
sa_session.flush()
return tool_dependency
+def create_temporary_tool_dependencies_config( tool_shed_url, name, owner, changeset_revision ):
+ """Make a call to the tool shed to get the required repository's tool_dependencies.xml file."""
+ url = url_join( tool_shed_url,
+ 'repository/get_tool_dependencies_config_contents?name=%s&owner=%s&changeset_revision=%s' % \
+ ( name, owner, changeset_revision ) )
+ response = urllib2.urlopen( url )
+ text = response.read()
+ response.close()
+ if text:
+ # Write the contents to a temporary file on disk so it can be reloaded and parsed.
+ fh = tempfile.NamedTemporaryFile( 'wb' )
+ tmp_filename = fh.name
+ fh.close()
+ fh = open( tmp_filename, 'wb' )
+ fh.write( text )
+ fh.close()
+ return tmp_filename
+ else:
+ message = "Unable to retrieve required tool_dependencies.xml file from the tool shed for revision "
+ message += "%s of installed repository %s owned by %s." % ( str( changeset_revision ), str( name ), str( owner ) )
+ raise Exception( message )
+ return None
+def get_absolute_path_to_file_in_repository( repo_files_dir, file_name ):
+ """Return the absolute path to a specified disk file contained in a repository."""
+ stripped_file_name = strip_path( file_name )
+ file_path = None
+ for root, dirs, files in os.walk( repo_files_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name == stripped_file_name:
+ return os.path.abspath( os.path.join( root, name ) )
+ return file_path
+def get_tool_shed_repository_by_tool_shed_name_owner_changeset_revision( app, tool_shed_url, name, owner, changeset_revision ):
+ sa_session = app.model.context.current
+ tool_shed = clean_tool_shed_url( tool_shed_url )
+ tool_shed_repository = sa_session.query( app.model.ToolShedRepository ) \
+ .filter( and_( app.model.ToolShedRepository.table.c.tool_shed == tool_shed,
+ app.model.ToolShedRepository.table.c.name == name,
+ app.model.ToolShedRepository.table.c.owner == owner,
+ app.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
+ .first()
+ if tool_shed_repository:
+ return tool_shed_repository
+ # The tool_shed_repository must have been updated to a newer changeset revision than the one defined in the repository_dependencies.xml file,
+ # so call the tool shed to get all appropriate newer changeset revisions.
+ text = get_updated_changeset_revisions_from_tool_shed( tool_shed_url, name, owner, changeset_revision )
+ if text:
+ changeset_revisions = listify( text )
+ for changeset_revision in changeset_revisions:
+ tool_shed_repository = sa_session.query( app.model.ToolShedRepository ) \
+ .filter( and_( app.model.ToolShedRepository.table.c.tool_shed == tool_shed,
+ app.model.ToolShedRepository.table.c.name == name,
+ app.model.ToolShedRepository.table.c.owner == owner,
+ app.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
+ .first()
+ if tool_shed_repository:
+ return tool_shed_repository
+ return None
def get_tool_dependency_by_name_type_repository( app, repository, name, type ):
sa_session = app.model.context.current
return sa_session.query( app.model.ToolDependency ) \
@@ -43,23 +105,83 @@
app.model.ToolDependency.table.c.version == version,
app.model.ToolDependency.table.c.type == type ) ) \
.first()
-def get_tool_dependency_install_dir( app, repository, type, name, version ):
- if type == 'package':
+def get_tool_dependency_install_dir( app, repository_name, repository_owner, repository_changeset_revision, tool_dependency_type, tool_dependency_name,
+ tool_dependency_version ):
+ if tool_dependency_type == 'package':
return os.path.abspath( os.path.join( app.config.tool_dependency_dir,
- name,
- version,
- repository.owner,
- repository.name,
- repository.installed_changeset_revision ) )
- if type == 'set_environment':
+ tool_dependency_name,
+ tool_dependency_version,
+ repository_owner,
+ repository_name,
+ repository_changeset_revision ) )
+ if tool_dependency_type == 'set_environment':
return os.path.abspath( os.path.join( app.config.tool_dependency_dir,
'environment_settings',
- name,
- repository.owner,
- repository.name,
- repository.installed_changeset_revision ) )
+ tool_dependency_name,
+ repository_owner,
+ repository_name,
+ repository_changeset_revision ) )
def get_tool_shed_repository_install_dir( app, tool_shed_repository ):
return os.path.abspath( tool_shed_repository.repo_files_directory( app ) )
+def get_updated_changeset_revisions_from_tool_shed( tool_shed_url, name, owner, changeset_revision ):
+ """Get all appropriate newer changeset revisions for the repository defined by the received tool_shed_url / name / owner combination."""
+ url = url_join( tool_shed_url,
+ 'repository/updated_changeset_revisions?name=%s&owner=%s&changeset_revision=%s' % ( name, owner, changeset_revision ) )
+ response = urllib2.urlopen( url )
+ text = response.read()
+ response.close()
+ return text
+def handle_set_environment_entry_for_package( app, install_dir, tool_shed_repository, package_name, package_version, elem ):
+ action_dict = {}
+ actions = []
+ for package_elem in elem:
+ if package_elem.tag == 'install':
+ # Create the tool_dependency record in the database.
+ tool_dependency = create_or_update_tool_dependency( app=app,
+ tool_shed_repository=tool_shed_repository,
+ name=package_name,
+ version=package_version,
+ type='package',
+ status=app.model.ToolDependency.installation_status.INSTALLING,
+ set_status=True )
+ # Get the installation method version from a tag like: <install version="1.0">
+ package_install_version = package_elem.get( 'version', '1.0' )
+ if package_install_version == '1.0':
+ # Since the required tool dependency is installed for a repository dependency, all we need to do
+ # is inspect the <actions> tag set to find the <action type="set_environment"> tag.
+ for actions_elem in package_elem:
+ for action_elem in actions_elem:
+ action_type = action_elem.get( 'type', 'shell_command' )
+ if action_type == 'set_environment':
+ # <action type="set_environment">
+ # <environment_variable name="PYTHONPATH" action="append_to">$INSTALL_DIR/lib/python</environment_variable>
+ # <environment_variable name="PATH" action="prepend_to">$INSTALL_DIR/bin</environment_variable>
+ # </action>
+ env_var_dicts = []
+ for env_elem in action_elem:
+ if env_elem.tag == 'environment_variable':
+ env_var_dict = common_util.create_env_var_dict( env_elem, tool_dependency_install_dir=install_dir )
+ if env_var_dict:
+ env_var_dicts.append( env_var_dict )
+ if env_var_dicts:
+ action_dict[ env_elem.tag ] = env_var_dicts
+ actions.append( ( action_type, action_dict ) )
+ return tool_dependency, actions
+ return None, actions
+def install_and_build_package_via_fabric( app, tool_dependency, actions_dict ):
+ sa_session = app.model.context.current
+ try:
+ # There is currently only one fabric method.
+ fabric_util.install_and_build_package( app, tool_dependency, actions_dict )
+ except Exception, e:
+ tool_dependency.status = app.model.ToolDependency.installation_status.ERROR
+ tool_dependency.error_message = str( e )
+ sa_session.add( tool_dependency )
+ sa_session.flush()
+ if tool_dependency.status != app.model.ToolDependency.installation_status.ERROR:
+ tool_dependency.status = app.model.ToolDependency.installation_status.INSTALLED
+ sa_session.add( tool_dependency )
+ sa_session.flush()
def install_package( app, elem, tool_shed_repository, tool_dependencies=None ):
# The value of tool_dependencies is a partial or full list of ToolDependency records associated with the tool_shed_repository.
sa_session = app.model.context.current
@@ -69,18 +191,101 @@
package_version = elem.get( 'version', None )
if package_name and package_version:
if tool_dependencies:
- install_dir = get_tool_dependency_install_dir( app,
- repository=tool_shed_repository,
- type='package',
- name=package_name,
- version=package_version )
+ # Get the installation directory for tool dependencies that will be installed for the received tool_shed_repository.
+ install_dir = get_tool_dependency_install_dir( app=app,
+ repository_name=tool_shed_repository.name,
+ repository_owner=tool_shed_repository.owner,
+ repository_changeset_revision=tool_shed_repository.installed_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
if not os.path.exists( install_dir ):
for package_elem in elem:
- if package_elem.tag == 'install':
+ if package_elem.tag == 'repository':
+ # We have a complex repository dependency definition.
+ tool_shed = package_elem.attrib[ 'toolshed' ]
+ required_repository_name = package_elem.attrib[ 'name' ]
+ required_repository_owner = package_elem.attrib[ 'owner' ]
+ required_repository_changeset_revision = package_elem.attrib[ 'changeset_revision' ]
+ required_repository = get_tool_shed_repository_by_tool_shed_name_owner_changeset_revision( app,
+ tool_shed,
+ required_repository_name,
+ required_repository_owner,
+ required_repository_changeset_revision )
+ tmp_filename = None
+ if required_repository:
+ # Set this repository's tool dependency env.sh file with a path to the required repository's installed tool dependency package.
+ # We can get everything we need from the discovered installed required_repository.
+ if required_repository.status in [ app.model.ToolShedRepository.installation_status.DEACTIVATED,
+ app.model.ToolShedRepository.installation_status.INSTALLED ]:
+ # Define the installation directory for the required tool dependency in the required repository.
+ required_repository_package_install_dir = \
+ get_tool_dependency_install_dir( app=app,
+ repository_name=required_repository.name,
+ repository_owner=required_repository.owner,
+ repository_changeset_revision=required_repository.installed_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
+ assert os.path.exists( required_repository_package_install_dir ), \
+ 'Missing required tool dependency directory %s' % str( required_repository_package_install_dir )
+ repo_files_dir = required_repository.repo_files_directory( app )
+ tool_dependencies_config = get_absolute_path_to_file_in_repository( repo_files_dir, 'tool_dependencies.xml' )
+ if tool_dependencies_config:
+ config_to_use = tool_dependencies_config
+ else:
+ message = "Unable to locate required tool_dependencies.xml file for revision %s of installed repository %s owned by %s." % \
+ ( str( required_repository.changeset_revision ), str( required_repository.name ), str( required_repository.owner ) )
+ raise Exception( message )
+ else:
+ # Make a call to the tool shed to get the changeset revision to which the current value of required_repository_changeset_revision
+ # should be updated if it's not current.
+ text = get_updated_changeset_revisions_from_tool_shed( tool_shed_url=tool_shed,
+ name=required_repository_name,
+ owner=required_repository_owner,
+ changeset_revision=required_repository_changeset_revision )
+ if text:
+ updated_changeset_revisions = listify( text )
+ # The list of changeset revisions is in reverse order, so the newest will be first.
+ required_repository_changeset_revision = updated_changeset_revisions[ 0 ]
+ # Define the installation directory for the required tool dependency in the required repository.
+ required_repository_package_install_dir = \
+ get_tool_dependency_install_dir( app=app,
+ repository_name=required_repository_name,
+ repository_owner=required_repository_owner,
+ repository_changeset_revision=required_repository_changeset_revision,
+ tool_dependency_type='package',
+ tool_dependency_name=package_name,
+ tool_dependency_version=package_version )
+ # Make a call to the tool shed to get the required repository's tool_dependencies.xml file.
+ tmp_filename = create_temporary_tool_dependencies_config( tool_shed,
+ required_repository_name,
+ required_repository_owner,
+ required_repository_changeset_revision )
+ config_to_use = tmp_filename
+ tool_dependency, actions_dict = populate_actions_dict( app=app,
+ dependent_install_dir=install_dir,
+ required_install_dir=required_repository_package_install_dir,
+ tool_shed_repository=tool_shed_repository,
+ package_name=package_name,
+ package_version=package_version,
+ tool_dependencies_config=config_to_use )
+ if tmp_filename:
+ try:
+ os.remove( tmp_filename )
+ except:
+ pass
+ # Install and build the package via fabric.
+ install_and_build_package_via_fabric( app, tool_dependency, actions_dict )
+ else:
+ message = "Unable to locate required tool shed repository named %s owned by %s with revision %s." % \
+ ( str( name ), str( owner ), str( changeset_revision ) )
+ raise Exception( message )
+ elif package_elem.tag == 'install':
# <install version="1.0">
package_install_version = package_elem.get( 'version', '1.0' )
- tool_dependency = create_or_update_tool_dependency( app,
- tool_shed_repository,
+ tool_dependency = create_or_update_tool_dependency( app=app,
+ tool_shed_repository=tool_shed_repository,
name=package_name,
version=package_version,
type='package',
@@ -168,7 +373,7 @@
if env_elem.tag == 'environment_variable':
env_var_dict = common_util.create_env_var_dict( env_elem, tool_dependency_install_dir=install_dir )
if env_var_dict:
- env_var_dicts.append( env_var_dict )
+ env_var_dicts.append( env_var_dict )
if env_var_dicts:
action_dict[ env_elem.tag ] = env_var_dicts
else:
@@ -183,18 +388,56 @@
# run_proprietary_fabric_method( app, elem, proprietary_fabfile_path, install_dir, package_name=package_name )
raise Exception( 'Tool dependency installation using proprietary fabric scripts is not yet supported.' )
else:
- try:
- # There is currently only one fabric method.
- fabric_util.install_and_build_package( app, tool_dependency, actions_dict )
- except Exception, e:
- tool_dependency.status = app.model.ToolDependency.installation_status.ERROR
- tool_dependency.error_message = str( e )
- sa_session.add( tool_dependency )
- sa_session.flush()
- if tool_dependency.status != app.model.ToolDependency.installation_status.ERROR:
- tool_dependency.status = app.model.ToolDependency.installation_status.INSTALLED
- sa_session.add( tool_dependency )
- sa_session.flush()
+ install_and_build_package_via_fabric( app, tool_dependency, actions_dict )
+def listify( item ):
+ """
+ Make a single item a single item list, or return a list if passed a
+ list. Passing a None returns an empty list.
+ """
+ if not item:
+ return []
+ elif isinstance( item, list ):
+ return item
+ elif isinstance( item, basestring ) and item.count( ',' ):
+ return item.split( ',' )
+ else:
+ return [ item ]
+def populate_actions_dict( app, dependent_install_dir, required_install_dir, tool_shed_repository, package_name, package_version, tool_dependencies_config ):
+ """
+ Populate an actions dictionary that can be sent to fabric_util.install_and_build_package. This method handles the scenario where a tool_dependencies.xml
+ file defines a complex repository dependency. In this case, the tool dependency package will be installed in a separate repository and the tool dependency
+ defined for the dependent repository will use an environment_variable setting defined in it's env.sh file to locate the required package. This method
+ basically does what the install_via_fabric method does, but restricts it's activity to the <action type="set_environment"> tag set within the required
+ repository's tool_dependencies.xml file.
+ """
+ sa_session = app.model.context.current
+ if not os.path.exists( dependent_install_dir ):
+ os.makedirs( dependent_install_dir )
+ actions_dict = dict( install_dir=dependent_install_dir )
+ if package_name:
+ actions_dict[ 'package_name' ] = package_name
+ tool_dependency = None
+ action_dict = {}
+ if tool_dependencies_config:
+ required_td_tree = parse_xml( tool_dependencies_config )
+ required_td_root = required_td_tree.getroot()
+ for required_td_elem in required_td_root:
+ # Find the appropriate package name and version.
+ if required_td_elem.tag == 'package':
+ # <package name="bwa" version="0.5.9">
+ required_td_package_name = required_td_elem.get( 'name', None )
+ required_td_package_version = required_td_elem.get( 'version', None )
+ if required_td_package_name==package_name and required_td_package_version==package_version:
+ tool_dependency, actions = handle_set_environment_entry_for_package( app=app,
+ install_dir=required_install_dir,
+ tool_shed_repository=tool_shed_repository,
+ package_name=package_name,
+ package_version=package_version,
+ elem=required_td_elem )
+ if actions:
+ actions_dict[ 'actions' ] = actions
+ break
+ return tool_dependency, actions_dict
def run_proprietary_fabric_method( app, elem, proprietary_fabfile_path, install_dir, package_name=None, **kwd ):
"""
TODO: Handle this using the fabric api.
@@ -248,6 +491,10 @@
tmp_stderr = open( tmp_name, 'rb' )
message = '%s\n' % str( tmp_stderr.read() )
tmp_stderr.close()
+ try:
+ os.remove( tmp_name )
+ except:
+ pass
return returncode, message
def set_environment( app, elem, tool_shed_repository ):
"""
@@ -258,6 +505,11 @@
<environment_variable name="R_SCRIPT_PATH" action="set_to">$REPOSITORY_INSTALL_DIR</environment_variable></set_environment>
"""
+ # TODO: Add support for a repository dependency definition within this tool dependency type's tag set. This should look something like
+ # the following. See the implementation of support for this in the tool dependency package type's method above.
+ # <set_environment version="1.0">
+ # <repository toolshed="<tool shed>" name="<repository name>" owner="<repository owner>" changeset_revision="<changeset revision>" />
+ # </set_environment>
sa_session = app.model.context.current
tool_dependency = None
env_var_version = elem.get( 'version', '1.0' )
@@ -267,18 +519,20 @@
env_var_name = env_var_elem.get( 'name', None )
env_var_action = env_var_elem.get( 'action', None )
if env_var_name and env_var_action:
- install_dir = get_tool_dependency_install_dir( app,
- repository=tool_shed_repository,
- type='set_environment',
- name=env_var_name,
- version=None )
+ install_dir = get_tool_dependency_install_dir( app=app,
+ repository_name=tool_shed_repository.name,
+ repository_owner=tool_shed_repository.owner,
+ repository_changeset_revision=tool_shed_repository.installed_changeset_revision,
+ tool_dependency_type='set_environment',
+ tool_dependency_name=env_var_name,
+ tool_dependency_version=None )
tool_shed_repository_install_dir = get_tool_shed_repository_install_dir( app, tool_shed_repository )
env_var_dict = common_util.create_env_var_dict( env_var_elem, tool_shed_repository_install_dir=tool_shed_repository_install_dir )
if env_var_dict:
if not os.path.exists( install_dir ):
os.makedirs( install_dir )
- tool_dependency = create_or_update_tool_dependency( app,
- tool_shed_repository,
+ tool_dependency = create_or_update_tool_dependency( app=app,
+ tool_shed_repository=tool_shed_repository,
name=env_var_name,
version=None,
type='set_environment',
@@ -294,3 +548,22 @@
sa_session.add( tool_dependency )
sa_session.flush()
print 'Environment variable ', env_var_name, 'set in', install_dir
+def strip_path( fpath ):
+ if not fpath:
+ return fpath
+ try:
+ file_path, file_name = os.path.split( fpath )
+ except:
+ file_name = fpath
+ return file_name
+def parse_xml( file_name ):
+ """Returns a parsed xml tree."""
+ tree = ElementTree.parse( file_name )
+ root = tree.getroot()
+ ElementInclude.include( root )
+ return tree
+def url_join( *args ):
+ parts = []
+ for arg in args:
+ parts.append( arg.strip( '/' ) )
+ return '/'.join( parts )
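A quick usage illustration of the small helpers added to install_util.py above; the import path follows the file location in this diff, and the URLs and revisions are hypothetical:

from galaxy.tool_shed.tool_dependencies.install_util import clean_tool_shed_url, listify, url_join

print( clean_tool_shed_url( 'http://toolshed.g2.bx.psu.edu/' ) )
# -> toolshed.g2.bx.psu.edu
print( url_join( 'http://toolshed.g2.bx.psu.edu', 'repository/updated_changeset_revisions' ) )
# -> http://toolshed.g2.bx.psu.edu/repository/updated_changeset_revisions
print( listify( 'rev1,rev2,rev3' ) )  # -> ['rev1', 'rev2', 'rev3']
print( listify( None ) )              # -> []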
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -414,7 +414,7 @@
cleaned_repository_clone_url = suc.clean_repository_clone_url( repository_clone_url )
if not owner:
owner = get_repository_owner( cleaned_repository_clone_url )
- tool_shed = cleaned_repository_clone_url.split( 'repos' )[ 0 ].rstrip( '/' )
+ tool_shed = cleaned_repository_clone_url.split( '/repos/' )[ 0 ].rstrip( '/' )
for guid, tool_section_dicts in tool_panel_dict.items():
for tool_section_dict in tool_section_dicts:
tool_section = None
@@ -484,20 +484,6 @@
tool_section_dicts = generate_tool_section_dicts( tool_config=file_name, tool_sections=tool_sections )
tool_panel_dict[ guid ] = tool_section_dicts
return tool_panel_dict
-def generate_tool_path( repository_clone_url, changeset_revision ):
- """
- Generate a tool path that guarantees repositories with the same name will always be installed
- in different directories. The tool path will be of the form:
- <tool shed url>/repos/<repository owner>/<repository name>/<installed changeset revision>
- http://test@bx.psu.edu:9009/repos/test/filter
- """
- tmp_url = suc.clean_repository_clone_url( repository_clone_url )
- # Now tmp_url is something like: bx.psu.edu:9009/repos/some_username/column
- items = tmp_url.split( 'repos' )
- tool_shed_url = items[ 0 ]
- repo_path = items[ 1 ]
- tool_shed_url = suc.clean_tool_shed_url( tool_shed_url )
- return suc.url_join( tool_shed_url, 'repos', repo_path, changeset_revision )
def generate_tool_section_dicts( tool_config=None, tool_sections=None ):
tool_section_dicts = []
if tool_config is None:
@@ -529,6 +515,18 @@
else:
tool_section = None
return tool_section
+def generate_tool_shed_repository_install_dir( repository_clone_url, changeset_revision ):
+ """
+ Generate a repository installation directory that guarantees repositories with the same name will always be installed in different directories.
+ The tool path will be of the form: <tool shed url>/repos/<repository owner>/<repository name>/<installed changeset revision>
+ """
+ tmp_url = suc.clean_repository_clone_url( repository_clone_url )
+ # Now tmp_url is something like: bx.psu.edu:9009/repos/some_username/column
+ items = tmp_url.split( '/repos/' )
+ tool_shed_url = items[ 0 ]
+ repo_path = items[ 1 ]
+ tool_shed_url = suc.clean_tool_shed_url( tool_shed_url )
+ return suc.url_join( tool_shed_url, 'repos', repo_path, changeset_revision )
def get_config( config_file, repo, ctx, dir ):
"""Return the latest version of config_filename from the repository manifest."""
config_file = suc.strip_path( config_file )
@@ -821,14 +819,14 @@
readme_files_dict = json.from_json_string( raw_text )
return readme_files_dict
def get_repository_owner( cleaned_repository_url ):
- items = cleaned_repository_url.split( 'repos' )
+ items = cleaned_repository_url.split( '/repos/' )
repo_path = items[ 1 ]
if repo_path.startswith( '/' ):
repo_path = repo_path.replace( '/', '', 1 )
return repo_path.lstrip( '/' ).split( '/' )[ 0 ]
def get_repository_owner_from_clone_url( repository_clone_url ):
tmp_url = suc.clean_repository_clone_url( repository_clone_url )
- tool_shed = tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
+ tool_shed = tmp_url.split( '/repos/' )[ 0 ].rstrip( '/' )
return get_repository_owner( tmp_url )
def get_required_repo_info_dicts( tool_shed_url, repo_info_dicts ):
"""
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -1202,7 +1202,10 @@
app.config.tool_data_table_config_path = original_tool_data_table_config_path
return metadata_dict, invalid_file_tups
def generate_package_dependency_metadata( app, elem, tool_dependencies_dict ):
- """The value of package_name must match the value of the "package" type in the tool config's <requirements> tag set."""
+ """
+ Generate the metadata for a tool dependencies package defined for a repository. The value of package_name must match the value of the "package"
+ type in the tool config's <requirements> tag set. This method is called from both Galaxy and the tool shed.
+ """
repository_dependency_tup = []
requirements_dict = {}
error_message = ''
@@ -1217,9 +1220,9 @@
requirements_dict[ 'readme' ] = sub_elem.text
elif sub_elem.tag == 'repository':
# We have a complex repository dependency.
- current_rd_tups, error_message = handle_repository_elem_for_tool_shed( app=app,
- repository_elem=sub_elem,
- repository_dependencies_tups=None )
+ current_rd_tups, error_message = handle_repository_elem( app=app,
+ repository_elem=sub_elem,
+ repository_dependencies_tups=None )
repository_dependency_tup = current_rd_tups[ 0 ]
if requirements_dict:
dependency_key = '%s/%s' % ( package_name, package_version )
@@ -1274,7 +1277,7 @@
is_valid = False
if is_valid:
for repository_elem in root.findall( 'repository' ):
- current_rd_tups, error_message = handle_repository_elem_for_tool_shed( app, repository_elem, repository_dependencies_tups )
+ current_rd_tups, error_message = handle_repository_elem( app, repository_elem, repository_dependencies_tups )
if error_message:
log.debug( error_message )
return metadata_dict, error_message
@@ -1477,7 +1480,7 @@
metadata_dict[ 'workflows' ] = [ ( relative_path, exported_workflow_dict ) ]
return metadata_dict
def get_absolute_path_to_file_in_repository( repo_files_dir, file_name ):
- """Return the absolute path to a specified disk file containe in a repository."""
+ """Return the absolute path to a specified disk file contained in a repository."""
stripped_file_name = strip_path( file_name )
file_path = None
for root, dirs, files in os.walk( repo_files_dir ):
@@ -1677,8 +1680,8 @@
return None
def get_next_downloadable_changeset_revision( repository, repo, after_changeset_revision ):
"""
- Return the installable changeset_revision in the repository changelog after to the changeset to which after_changeset_revision
- refers. If there isn't one, return None.
+ Return the installable changeset_revision in the repository changelog after the changeset to which after_changeset_revision refers. If there
+ isn't one, return None.
"""
changeset_revisions = get_ordered_downloadable_changeset_revisions( repository, repo )
if len( changeset_revisions ) == 1:
@@ -2157,7 +2160,7 @@
.first()
def get_tool_shed_from_clone_url( repository_clone_url ):
tmp_url = clean_repository_clone_url( repository_clone_url )
- return tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
+ return tmp_url.split( '/repos/' )[ 0 ].rstrip( '/' )
def get_updated_changeset_revisions_for_repository_dependencies( trans, key_rd_dicts ):
updated_key_rd_dicts = []
for key_rd_dict in key_rd_dicts:
@@ -2411,6 +2414,67 @@
all_repository_dependencies=all_repository_dependencies,
handled_key_rd_dicts=handled_key_rd_dicts,
circular_repository_dependencies=circular_repository_dependencies )
+def handle_repository_elem( app, repository_elem, repository_dependencies_tups ):
+ """
+ Process the received repository_elem which is a <repository> tag either from a repository_dependencies.xml file or a tool_dependencies.xml file.
+ If the former, we're generating repository dependencies metadata for a repository in the tool shed. If the latter, we're generating package
+ dependency metadata with in Galaxy or the tool shed.
+ """
+ if repository_dependencies_tups is None:
+ new_rd_tups = []
+ else:
+ new_rd_tups = [ rdt for rdt in repository_dependencies_tups ]
+ error_message = ''
+ sa_session = app.model.context.current
+ toolshed = repository_elem.attrib[ 'toolshed' ]
+ name = repository_elem.attrib[ 'name' ]
+ owner = repository_elem.attrib[ 'owner' ]
+ changeset_revision = repository_elem.attrib[ 'changeset_revision' ]
+ user = None
+ repository = None
+ if app.name == 'galaxy':
+ # We're in Galaxy.
+ try:
+ repository = sa_session.query( app.model.ToolShedRepository ) \
+ .filter( and_( app.model.ToolShedRepository.table.c.name == name,
+ app.model.ToolShedRepository.table.c.owner == owner ) ) \
+ .first()
+ except:
+ error_message = "Invalid name %s or owner %s defined for repository. Repository dependencies will be ignored." % ( name, owner )
+ log.debug( error_message )
+ return new_rd_tups, error_message
+ repository_dependencies_tup = ( toolshed, name, owner, changeset_revision )
+ if repository_dependencies_tup not in new_rd_tups:
+ new_rd_tups.append( repository_dependencies_tup )
+ else:
+ # We're in the tool shed.
+ if tool_shed_is_this_tool_shed( toolshed ):
+ try:
+ user = sa_session.query( app.model.User ) \
+ .filter( app.model.User.table.c.username == owner ) \
+ .one()
+ except Exception, e:
+ error_message = "Invalid owner %s defined for repository %s. Repository dependencies will be ignored." % ( owner, name )
+ log.debug( error_message )
+ return new_rd_tups, error_message
+ try:
+ repository = sa_session.query( app.model.Repository ) \
+ .filter( and_( app.model.Repository.table.c.name == name,
+ app.model.Repository.table.c.user_id == user.id ) ) \
+ .first()
+ except:
+ error_message = "Invalid name %s or owner %s defined for repository. Repository dependencies will be ignored." % ( name, owner )
+ log.debug( error_message )
+ return new_rd_tups, error_message
+ repository_dependencies_tup = ( toolshed, name, owner, changeset_revision )
+ if repository_dependencies_tup not in new_rd_tups:
+ new_rd_tups.append( repository_dependencies_tup )
+ else:
+ # Repository dependencies are currently supported within a single tool shed.
+ error_message = "Invalid tool shed %s defined for repository %s. " % ( toolshed, name )
+ error_message += "Repository dependencies are currently supported within a single tool shed, so your definition will be ignored."
+ log.debug( error_message )
+ return new_rd_tups, error_message
def handle_sample_files_and_load_tool_from_disk( trans, repo_files_dir, tool_config_filepath, work_dir ):
# Copy all sample files from disk to a temporary directory since the sample files may be in multiple directories.
message = ''
@@ -2487,54 +2551,6 @@
if is_orphan_in_tool_shed:
return True
return False
-def handle_repository_elem_for_tool_shed( app, repository_elem, repository_dependencies_tups ):
- if repository_dependencies_tups is None:
- new_rd_tups = []
- else:
- new_rd_tups = [ rdt for rdt in repository_dependencies_tups ]
- error_message = ''
- sa_session = app.model.context.current
- toolshed = repository_elem.attrib[ 'toolshed' ]
- name = repository_elem.attrib[ 'name' ]
- owner = repository_elem.attrib[ 'owner' ]
- changeset_revision = repository_elem.attrib[ 'changeset_revision' ]
- user = None
- repository = None
- if tool_shed_is_this_tool_shed( toolshed ):
- try:
- user = sa_session.query( app.model.User ) \
- .filter( app.model.User.table.c.username == owner ) \
- .one()
- except Exception, e:
- error_message = "Invalid owner %s defined for repository %s. Repository dependencies will be ignored." % ( owner, name )
- log.debug( error_message )
- return new_rd_tups, error_message
- if user:
- try:
- repository = sa_session.query( app.model.Repository ) \
- .filter( and_( app.model.Repository.table.c.name == name,
- app.model.Repository.table.c.user_id == user.id ) ) \
- .first()
- except:
- error_message = "Invalid name %s or owner %s defined for repository. Repository dependencies will be ignored." % ( name, owner )
- log.debug( error_message )
- return new_rd_tups, error_message
- if repository:
- repository_dependencies_tup = ( toolshed, name, owner, changeset_revision )
- if repository_dependencies_tup not in new_rd_tups:
- new_rd_tups.append( repository_dependencies_tup )
- else:
- error_message = "Invalid name %s or owner %s defined for repository. Repository dependencies will be ignored." % ( name, owner )
- log.debug( error_message )
- else:
- error_message = "Invalid owner %s defined for owner of repository %s. Repository dependencies will be ignored." % ( owner, name )
- log.debug( error_message )
- else:
- # Repository dependencies are currentlhy supported within a single tool shed.
- error_message = "Invalid tool shed %s defined for repository %s. " % ( toolshed, name )
- error_message += "Repository dependencies are currently supported within a single tool shed, so your definition will be ignored."
- log.debug( error_message )
- return new_rd_tups, error_message
def has_previous_repository_reviews( trans, repository, changeset_revision ):
"""Determine if a repository has a changeset revision review prior to the received changeset revision."""
repo = hg.repository( get_configured_ui(), repository.repo_path( trans.app ) )
@@ -2588,7 +2604,22 @@
return True
return False
def is_downloadable( metadata_dict ):
- return 'datatypes' in metadata_dict or 'repository_dependencies' in metadata_dict or 'tools' in metadata_dict or 'workflows' in metadata_dict
+ if 'datatypes' in metadata_dict:
+ # We have proprietary datatypes.
+ return True
+ if 'repository_dependencies' in metadata_dict:
+ # We have repository_dependencies.
+ return True
+ if 'tools' in metadata_dict:
+ # We have tools.
+ return True
+ if 'tool_dependencies' in metadata_dict:
+ # We have tool dependencies, and perhaps only tool dependencies!
+ return True
+ if 'workflows' in metadata_dict:
+ # We have exported workflows.
+ return True
+ return False
def initialize_all_repository_dependencies( current_repository_key, repository_dependencies_dict, all_repository_dependencies ):
# Initialize the all_repository_dependencies dictionary. It's safe to assume that current_repository_key in this case will have a value.
all_repository_dependencies[ 'root_key' ] = current_repository_key
@@ -3322,7 +3353,7 @@
return ''.join( translated )
return text
def tool_shed_from_repository_clone_url( repository_clone_url ):
- return clean_repository_clone_url( repository_clone_url ).split( 'repos' )[ 0 ].rstrip( '/' )
+ return clean_repository_clone_url( repository_clone_url ).split( '/repos/' )[ 0 ].rstrip( '/' )
def tool_shed_is_this_tool_shed( toolshed_base_url ):
return toolshed_base_url.rstrip( '/' ) == str( url_for( '/', qualified=True ) ).rstrip( '/' )
def translate_string( raw_text, to_html=True ):
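
For context on the new handle_repository_elem() function in the diff above: it reads the toolshed, name, owner and changeset_revision attributes from each <repository> tag and collects them as four-element tuples. A minimal sketch of that attribute extraction, using a hypothetical repository_dependencies.xml snippet (the element values below are invented for illustration):

# Minimal sketch: extract ( toolshed, name, owner, changeset_revision ) tuples
# from <repository> elements, as handle_repository_elem() does when generating metadata.
# The XML below is a hypothetical example, not taken from a real repository.
import xml.etree.ElementTree as ElementTree

repository_dependencies_xml = """
<repositories>
    <repository toolshed="http://localhost:9009" name="example_repo"
                owner="example_owner" changeset_revision="0123456789ab" />
</repositories>
"""

root = ElementTree.fromstring( repository_dependencies_xml )
rd_tups = []
for repository_elem in root.findall( 'repository' ):
    rd_tup = ( repository_elem.attrib[ 'toolshed' ],
               repository_elem.attrib[ 'name' ],
               repository_elem.attrib[ 'owner' ],
               repository_elem.attrib[ 'changeset_revision' ] )
    if rd_tup not in rd_tups:
        rd_tups.append( rd_tup )
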
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -1466,7 +1466,7 @@
return repo_info_dict
@web.expose
def get_tool_dependencies( self, trans, **kwd ):
- """Handle a request from a Galaxy instance."""
+ """Handle a request from a Galaxy instance to get the tool_dependencies entry from the metadata for a specified changeset revision."""
params = util.Params( kwd )
name = params.get( 'name', None )
owner = params.get( 'owner', None )
@@ -1481,6 +1481,26 @@
return encoding_util.tool_shed_encode( tool_dependencies )
return ''
@web.expose
+ def get_tool_dependencies_config_contents( self, trans, **kwd ):
+ """Handle a request from a Galaxy instance to get the tool_dependencies.xml file contents for a specified changeset revision."""
+ params = util.Params( kwd )
+ name = params.get( 'name', None )
+ owner = params.get( 'owner', None )
+ changeset_revision = params.get( 'changeset_revision', None )
+ repository = suc.get_repository_by_name_and_owner( trans, name, owner )
+ # TODO: We're currently returning the tool_dependencies.xml file that is available on disk. We need to enhance this process
+ # to retrieve older versions of the tool_dependencies.xml file from the repository manifest.
+ repo_dir = repository.repo_path( trans.app )
+ # Get the tool_dependencies.xml file from disk.
+ tool_dependencies_config = suc.get_config_from_disk( 'tool_dependencies.xml', repo_dir )
+ # Return the encoded contents of the tool_dependencies.xml file.
+ if tool_dependencies_config:
+ tool_dependencies_config_file = open( tool_dependencies_config, 'rb' )
+ contents = tool_dependencies_config_file.read()
+ tool_dependencies_config_file.close()
+ return contents
+ return ''
+ @web.expose
def get_tool_versions( self, trans, **kwd ):
"""
For each valid /downloadable change set (up to the received changeset_revision) in the repository's change log, append the change
@@ -1505,7 +1525,7 @@
return ''
@web.json
def get_updated_repository_information( self, trans, name, owner, changeset_revision, **kwd ):
- """Generate a disctionary that contains the information about a repository that is necessary for installing it into a local Galaxy instance."""
+ """Generate a dictionary that contains the information about a repository that is necessary for installing it into a local Galaxy instance."""
repository = suc.get_repository_by_name_and_owner( trans, name, owner )
repository_id = trans.security.encode_id( repository.id )
repository_clone_url = suc.generate_clone_url_for_repository_in_tool_shed( trans, repository )
@@ -2079,7 +2099,7 @@
repository = suc.get_repository_by_name_and_owner( trans, name, owner )
repo_dir = repository.repo_path( trans.app )
repo = hg.repository( suc.get_configured_ui(), repo_dir )
- # Get the lower bound changeset revision
+ # Get the lower bound changeset revision.
lower_bound_changeset_revision = suc.get_previous_downloadable_changset_revision( repository, repo, changeset_revision )
# Build the list of changeset revision hashes.
changeset_hashes = []
@@ -2404,6 +2424,35 @@
if list:
return ','.join( list )
return ''
+ @web.expose
+ def updated_changeset_revisions( self, trans, **kwd ):
+ """
+ Handle a request from a local Galaxy instance to retrieve the list of changeset revisions to which an installed repository can be updated. This
+ method will return a string of comma-separated changeset revision hashes for all available updates to the received changeset revision. Among
+ other things, this method handles the scenario where an installed tool shed repository's tool_dependency definition file defines a changeset
+ revision for a complex repository dependency that is outdated. In other words, a defined changeset revision is older than the current changeset
+ revision for the required repository, making it impossible to discover the repository without knowledge of revisions to which it could have been
+ updated.
+ """
+ params = util.Params( kwd )
+ name = params.get( 'name', None )
+ owner = params.get( 'owner', None )
+ changeset_revision = params.get( 'changeset_revision', None )
+ repository = suc.get_repository_by_name_and_owner( trans, name, owner )
+ repo_dir = repository.repo_path( trans.app )
+ repo = hg.repository( suc.get_configured_ui(), repo_dir )
+ # Get the upper bound changeset revision.
+ upper_bound_changeset_revision = suc.get_next_downloadable_changeset_revision( repository, repo, changeset_revision )
+ # Build the list of changeset revision hashes defining each available update up to, but excluding, upper_bound_changeset_revision.
+ changeset_hashes = []
+ for changeset in suc.reversed_lower_upper_bounded_changelog( repo, changeset_revision, upper_bound_changeset_revision ):
+ # Make sure to exclude upper_bound_changeset_revision.
+ if changeset != upper_bound_changeset_revision:
+ changeset_hashes.append( str( repo.changectx( changeset ) ) )
+ if changeset_hashes:
+ changeset_hashes_str = ','.join( changeset_hashes )
+ return changeset_hashes_str
+ return ''
def __validate_repository_name( self, name, user ):
# Repository names must be unique for each user, must be at least four characters
# in length and must contain only lower-case letters, numbers, and the '_' character.
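
The updated_changeset_revisions() method added above returns a plain comma-separated string of changeset hashes. A minimal sketch of how a caller might consume such a response, assuming it has already been fetched over HTTP (the hash values are invented):

# Minimal sketch: parse the comma-separated string returned by updated_changeset_revisions().
# The response text is an invented example; a real caller would receive it from the tool shed.
response_text = '837abc1e1aa1,9fd0aabc3de2,aa12bc34de56'

updated_changeset_hashes = []
if response_text:
    updated_changeset_hashes = [ changeset_hash.strip() for changeset_hash in response_text.split( ',' ) ]

# A caller can then test whether a required changeset revision is among the available updates.
required_changeset_revision = '9fd0aabc3de2'
is_reachable_by_update = required_changeset_revision in updated_changeset_hashes
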
diff -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 -r ea3da2000fa733a2d3ff5d5629dec58d3573645c lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -890,7 +890,7 @@
shed_util.update_tool_shed_repository_status( trans.app, tool_shed_repository, trans.model.ToolShedRepository.installation_status.CLONING )
repo_info_tuple = repo_info_dict[ tool_shed_repository.name ]
description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = repo_info_tuple
- relative_clone_dir = shed_util.generate_tool_path( repository_clone_url, tool_shed_repository.installed_changeset_revision )
+ relative_clone_dir = shed_util.generate_tool_shed_repository_install_dir( repository_clone_url, tool_shed_repository.installed_changeset_revision )
clone_dir = os.path.join( tool_path, relative_clone_dir )
relative_install_dir = os.path.join( relative_clone_dir, tool_shed_repository.name )
install_dir = os.path.join( tool_path, relative_install_dir )
@@ -1416,7 +1416,8 @@
install_tool_dependencies = CheckboxField.is_checked( kwd.get( 'install_tool_dependencies', '' ) )
shed_tool_conf, tool_path, relative_install_dir = suc.get_tool_panel_config_tool_path_install_dir( trans.app, tool_shed_repository )
repository_clone_url = suc.generate_clone_url_for_installed_repository( trans.app, tool_shed_repository )
- clone_dir = os.path.join( tool_path, shed_util.generate_tool_path( repository_clone_url, tool_shed_repository.installed_changeset_revision ) )
+ clone_dir = os.path.join( tool_path, shed_util.generate_tool_shed_repository_install_dir( repository_clone_url,
+ tool_shed_repository.installed_changeset_revision ) )
relative_install_dir = os.path.join( clone_dir, tool_shed_repository.name )
tool_shed_url = suc.get_url_from_repository_tool_shed( trans.app, tool_shed_repository )
tool_section = None
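
The renamed generate_tool_shed_repository_install_dir() helper derives a relative install directory from the repository clone URL and the installed changeset revision. A minimal sketch of that style of path construction, assuming a <tool shed host>/repos/<owner>/<name>/<changeset revision> layout (the layout and URL below are assumptions for illustration, not taken from this diff):

# Minimal sketch: derive a relative install directory from a clone URL and an installed
# changeset revision, assuming a <tool_shed>/repos/<owner>/<name>/<changeset> layout.
# The URL is illustrative only.
import os

def example_install_dir( repository_clone_url, installed_changeset_revision ):
    # Strip the protocol and any trailing slash from the clone URL,
    # e.g. 'toolshed.example.org/repos/some_owner/some_repo'.
    tmp_url = repository_clone_url.split( '://' )[ -1 ].rstrip( '/' )
    return os.path.join( tmp_url, installed_changeset_revision )

relative_clone_dir = example_install_dir( 'http://toolshed.example.org/repos/some_owner/some_repo', '0123456789ab' )
# -> 'toolshed.example.org/repos/some_owner/some_repo/0123456789ab'
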
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/9f2b4091e1e5/
changeset: 9f2b4091e1e5
user: Richard Park
date: 2013-01-30 21:48:54
summary: Fixed an error when retrieving workflows via the API. Added an additional tab to lines 88 and 89 to fix an error when trying to refer to the variable "step".
affected #: 1 file
diff -r 2fcef846289917ccf394bb2fabba8f36d55d0039 -r 9f2b4091e1e5114c8bfc692ce0e685cc75ada39b lib/galaxy/webapps/galaxy/api/workflows.py
--- a/lib/galaxy/webapps/galaxy/api/workflows.py
+++ b/lib/galaxy/webapps/galaxy/api/workflows.py
@@ -85,8 +85,8 @@
'type': step.type,
'tool_id': step.tool_id,
'input_steps': {}}
- for conn in step.input_connections:
- steps[step.id]['input_steps'][conn.input_name] = {'source_step': conn.output_step_id,
+ for conn in step.input_connections:
+ steps[step.id]['input_steps'][conn.input_name] = {'source_step': conn.output_step_id,
'step_output': conn.output_name}
item['steps'] = steps
return item
https://bitbucket.org/galaxy/galaxy-central/commits/618c34d9b2ea/
changeset: 618c34d9b2ea
user: dannon
date: 2013-01-30 21:57:10
summary: Merge Pull Request #114 https://bitbucket.org/galaxy/galaxy-central/pull-request/114/fixed-error-wh…
Adjusted spacing.
affected #: 1 file
diff -r a1d543da698d3d127cd9d7fedd61bd1ac5a19872 -r 618c34d9b2eab385e092c9ca2d7dd07b9ea31024 lib/galaxy/webapps/galaxy/api/workflows.py
--- a/lib/galaxy/webapps/galaxy/api/workflows.py
+++ b/lib/galaxy/webapps/galaxy/api/workflows.py
@@ -85,9 +85,9 @@
'type': step.type,
'tool_id': step.tool_id,
'input_steps': {}}
- for conn in step.input_connections:
- steps[step.id]['input_steps'][conn.input_name] = {'source_step': conn.output_step_id,
- 'step_output': conn.output_name}
+ for conn in step.input_connections:
+ steps[step.id]['input_steps'][conn.input_name] = {'source_step': conn.output_step_id,
+ 'step_output': conn.output_name}
item['steps'] = steps
return item
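
For reference, the code touched by these two commits builds a per-step dictionary with an input_steps sub-dictionary mapping each input name to its source step and output. A minimal sketch of the resulting structure (the step ids, tool id and input names are invented):

# Minimal sketch of the per-step structure assembled in workflows.py; the step ids,
# tool id and input name are invented illustrative values.
steps = {
    3: {
        'type': 'tool',
        'tool_id': 'example_tool',
        'input_steps': {
            'input1': { 'source_step': 1, 'step_output': 'output1' },
        },
    },
}
item = {}
item[ 'steps' ] = steps
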
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/a1d543da698d/
changeset: a1d543da698d
user: james_taylor
date: 2013-01-30 16:32:16
summary: web.framework: remove stray print
affected #: 1 file
diff -r 2fcef846289917ccf394bb2fabba8f36d55d0039 -r a1d543da698d3d127cd9d7fedd61bd1ac5a19872 lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py
+++ b/lib/galaxy/web/framework/__init__.py
@@ -278,7 +278,6 @@
if not( fname.startswith( "_" ) ) and fname.endswith( ".py" ):
name = fname[:-3]
module_name = package_name + "." + name
- print package_name, name, module_name
try:
module = import_module( module_name )
except ControllerUnavailable, exc:
3 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/fdd9dd36d6e6/
changeset: fdd9dd36d6e6
user: inithello
date: 2013-01-30 20:54:41
summary: Enable specifying the location of the migrated tools' XML file.
affected #: 1 file
diff -r 43fa4cb72e0fdc207a07fc4c6402273ca4bd4fc6 -r fdd9dd36d6e6bdd17ecaa219c15a6afeb047748c lib/galaxy/config.py
--- a/lib/galaxy/config.py
+++ b/lib/galaxy/config.py
@@ -57,7 +57,7 @@
self.test_conf = resolve_path( kwargs.get( "test_conf", "" ), self.root )
# The value of migrated_tools_config is the file reserved for containing only those tools that have been eliminated from the distribution
# and moved to the tool shed.
- self.migrated_tools_config = resolve_path( "migrated_tools_conf.xml", self.root )
+ self.migrated_tools_config = resolve_path( kwargs.get( 'migrated_tools_config', 'migrated_tools_conf.xml' ), self.root )
if 'tool_config_file' in kwargs:
tcf = kwargs[ 'tool_config_file' ]
elif 'tool_config_files' in kwargs:
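
With this change, migrated_tools_config is read from the configuration keyword arguments rather than being hard-coded, falling back to migrated_tools_conf.xml in the Galaxy root. A minimal sketch of that lookup-with-default pattern, assuming resolve_path() simply joins relative paths onto the root (an assumption, since resolve_path itself is not shown in this diff):

# Minimal sketch of the kwargs.get() fallback used above, assuming resolve_path()
# joins a relative path onto the Galaxy root directory.
import os

def resolve_path( path, root ):
    # Treat relative paths as relative to the Galaxy root; leave absolute paths alone.
    if not os.path.isabs( path ):
        path = os.path.join( root, path )
    return path

kwargs = { 'migrated_tools_config': 'test/tool_shed/tmp/test_migrated_tool_conf.xml' }
root = '/srv/galaxy'
migrated_tools_config = resolve_path( kwargs.get( 'migrated_tools_config', 'migrated_tools_conf.xml' ), root )
# -> '/srv/galaxy/test/tool_shed/tmp/test_migrated_tool_conf.xml'
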
https://bitbucket.org/galaxy/galaxy-central/commits/8f56f551b194/
changeset: 8f56f551b194
user: inithello
date: 2013-01-30 20:55:08
summary: Enable running tool shed functional tests with run_functional_tests.sh.
affected #: 1 file
diff -r fdd9dd36d6e6bdd17ecaa219c15a6afeb047748c -r 8f56f551b194ef92029b88c5a774bf3d13220af3 run_functional_tests.sh
--- a/run_functional_tests.sh
+++ b/run_functional_tests.sh
@@ -6,11 +6,13 @@
if [ ! $1 ]; then
python ./scripts/functional_tests.py -v --with-nosehtml --html-report-file run_functional_tests.html --exclude="^get" functional
elif [ $1 = 'help' ]; then
- echo "'run_functional_tests.sh' for testing all the tools in functional directory"
- echo "'run_functional_tests.sh aaa' for testing one test case of 'aaa' ('aaa' is the file name with path)"
- echo "'run_functional_tests.sh -id bbb' for testing one tool with id 'bbb' ('bbb' is the tool id)"
- echo "'run_functional_tests.sh -sid ccc' for testing one section with sid 'ccc' ('ccc' is the string after 'section::')"
- echo "'run_functional_tests.sh -list' for listing all the tool ids"
+ echo "'run_functional_tests.sh' for testing all the tools in functional directory"
+ echo "'run_functional_tests.sh aaa' for testing one test case of 'aaa' ('aaa' is the file name with path)"
+ echo "'run_functional_tests.sh -id bbb' for testing one tool with id 'bbb' ('bbb' is the tool id)"
+ echo "'run_functional_tests.sh -sid ccc' for testing one section with sid 'ccc' ('ccc' is the string after 'section::')"
+ echo "'run_functional_tests.sh -list' for listing all the tool ids"
+ echo "'run_functional_tests.sh -toolshed' for running all the test scripts in the ./test/tool_shed/functional directory"
+ echo "'run_functional_tests.sh -toolshed testscriptname' for running one test script named testscriptname in the .test/tool_shed/functional directory"
elif [ $1 = '-id' ]; then
python ./scripts/functional_tests.py -v functional.test_toolbox:TestForTool_$2 --with-nosehtml --html-report-file run_functional_tests.html
elif [ $1 = '-sid' ]; then
@@ -38,6 +40,12 @@
else
python ./scripts/functional_tests.py -v functional.test_toolbox --with-nosehtml --html-report-file run_functional_tests.html -installed
fi
+elif [ $1 = '-toolshed' ]; then
+ if [ ! $2 ]; then
+ python ./test/tool_shed/functional_tests.py -v --with-nosehtml --html-report-file ./test/tool_shed/run_functional_tests.html ./test/tool_shed/functional
+ else
+ python ./test/tool_shed/functional_tests.py -v --with-nosehtml --html-report-file ./test/tool_shed/run_functional_tests.html $2
+ fi
else
python ./scripts/functional_tests.py -v --with-nosehtml --html-report-file run_functional_tests.html $1
fi
https://bitbucket.org/galaxy/galaxy-central/commits/2fcef8462899/
changeset: 2fcef8462899
user: inithello
date: 2013-01-30 20:57:16
summary: Pass in a temporary file as migrated_tools_conf, so repositories without tools end up in the right location.
affected #: 1 file
diff -r 8f56f551b194ef92029b88c5a774bf3d13220af3 -r 2fcef846289917ccf394bb2fabba8f36d55d0039 test/tool_shed/functional_tests.py
--- a/test/tool_shed/functional_tests.py
+++ b/test/tool_shed/functional_tests.py
@@ -124,6 +124,7 @@
galaxy_tool_data_table_conf_file = os.environ.get( 'GALAXY_TEST_TOOL_DATA_TABLE_CONF', os.path.join( tool_shed_test_tmp_dir, 'tool_data_table_conf.xml' ) )
galaxy_tool_conf_file = os.environ.get( 'GALAXY_TEST_TOOL_CONF', os.path.join( tool_shed_test_tmp_dir, 'test_tool_conf.xml' ) )
galaxy_shed_tool_conf_file = os.environ.get( 'GALAXY_TEST_SHED_TOOL_CONF', os.path.join( tool_shed_test_tmp_dir, 'test_shed_tool_conf.xml' ) )
+ galaxy_migrated_tool_conf_file = os.environ.get( 'GALAXY_TEST_MIGRATED_TOOL_CONF', os.path.join( tool_shed_test_tmp_dir, 'test_migrated_tool_conf.xml' ) )
galaxy_tool_sheds_conf_file = os.environ.get( 'GALAXY_TEST_TOOL_SHEDS_CONF', os.path.join( tool_shed_test_tmp_dir, 'test_sheds_conf.xml' ) )
if 'GALAXY_TEST_TOOL_DATA_PATH' in os.environ:
tool_data_path = os.environ.get( 'GALAXY_TEST_TOOL_DATA_PATH' )
@@ -141,6 +142,7 @@
new_repos_path = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
galaxy_tempfiles = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
galaxy_shed_tool_path = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
+ galaxy_migrated_tool_path = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
galaxy_tool_dependency_dir = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
os.environ[ 'GALAXY_TEST_TOOL_DEPENDENCY_DIR' ] = galaxy_tool_dependency_dir
if 'TOOL_SHED_TEST_DBURI' in os.environ:
@@ -258,6 +260,9 @@
shed_tool_conf_template_parser = string.Template( shed_tool_conf_xml_template )
shed_tool_conf_xml = shed_tool_conf_template_parser.safe_substitute( shed_tool_path=galaxy_shed_tool_path )
file( galaxy_shed_tool_conf_file, 'w' ).write( shed_tool_conf_xml )
+ # Generate the migrated_tool_conf.xml file.
+ migrated_tool_conf_xml = shed_tool_conf_template_parser.safe_substitute( shed_tool_path=galaxy_migrated_tool_path )
+ file( galaxy_migrated_tool_conf_file, 'w' ).write( migrated_tool_conf_xml )
os.environ[ 'GALAXY_TEST_SHED_TOOL_CONF' ] = galaxy_shed_tool_conf_file
# ---- Build Galaxy Application --------------------------------------------------
@@ -275,6 +280,7 @@
tool_data_path = tool_data_path,
shed_tool_path = galaxy_shed_tool_path,
update_integrated_tool_panel = False,
+ migrated_tools_config = galaxy_migrated_tool_conf_file,
tool_config_file = [ galaxy_tool_conf_file, galaxy_shed_tool_conf_file ],
tool_sheds_config_file = galaxy_tool_sheds_conf_file,
datatype_converters_config_file = "datatype_converters_conf.xml.sample",
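
The test harness reuses the shed_tool_conf template to write a separate migrated tool config that points at its own temporary tool path. A minimal sketch of that string.Template substitution, using a simplified stand-in for the real shed_tool_conf.xml template:

# Minimal sketch of generating a tool panel config from a template, as the functional
# test harness does above. The template text here is a simplified stand-in.
import os
import string
import tempfile

shed_tool_conf_xml_template = '<?xml version="1.0"?>\n<toolbox tool_path="${shed_tool_path}">\n</toolbox>\n'

tool_shed_test_tmp_dir = tempfile.mkdtemp()
galaxy_migrated_tool_path = tempfile.mkdtemp( dir=tool_shed_test_tmp_dir )
galaxy_migrated_tool_conf_file = os.path.join( tool_shed_test_tmp_dir, 'test_migrated_tool_conf.xml' )

shed_tool_conf_template_parser = string.Template( shed_tool_conf_xml_template )
migrated_tool_conf_xml = shed_tool_conf_template_parser.safe_substitute( shed_tool_path=galaxy_migrated_tool_path )
with open( galaxy_migrated_tool_conf_file, 'w' ) as config_file:
    config_file.write( migrated_tool_conf_xml )
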
commit/galaxy-central: dan: Add a util.move_merge() function that makes moving directories more consistent.
by Bitbucket 30 Jan '13
30 Jan '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/43fa4cb72e0f/
changeset: 43fa4cb72e0f
user: dan
date: 2013-01-30 19:49:43
summary: Add a util.move_merge() function that makes moving directories more consistent.
affected #: 1 file
diff -r a9cbfdfeff11e2e2595b37825584216441a7a6d1 -r 43fa4cb72e0fdc207a07fc4c6402273ca4bd4fc6 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -2,7 +2,7 @@
Utility functions used systemwide.
"""
-import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib, errno
+import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib, errno, shutil
from email.MIMEText import MIMEText
# Older py compatibility
@@ -737,6 +737,18 @@
else:
raise e
+def move_merge( source, target ):
+ # When moving a directory with shutil, if the target exists the source
+ # directory is placed inside of it; if the target does not exist, the
+ # source is simply renamed to the target. This helper makes the target
+ # always be the target: if it already exists, the source contents are
+ # merged (moved) into it.
+ if os.path.isdir( source ) and os.path.exists( target ) and os.path.isdir( target ):
+ for name in os.listdir( source ):
+ move_merge( os.path.join( source, name ), os.path.join( target, name ) )
+ else:
+ return shutil.move( source, target )
+
galaxy_root_path = os.path.join(__path__[0], "..","..","..")
# The dbnames list is used in edit attributes and the upload tool
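
A minimal usage sketch of the merge behaviour described above, showing that an existing target directory receives the source contents rather than having the source directory nested inside it; the helper is repeated so the example is self-contained:

# Minimal usage sketch of the move_merge() semantics added above. The helper is
# repeated here so the example runs on its own; in Galaxy it lives in galaxy.util.
import os
import shutil
import tempfile

def move_merge( source, target ):
    # If both source and target are existing directories, merge the source
    # contents into the target instead of nesting source inside target.
    if os.path.isdir( source ) and os.path.exists( target ) and os.path.isdir( target ):
        for name in os.listdir( source ):
            move_merge( os.path.join( source, name ), os.path.join( target, name ) )
    else:
        return shutil.move( source, target )

base_dir = tempfile.mkdtemp()
source = os.path.join( base_dir, 'source' )
target = os.path.join( base_dir, 'target' )
os.makedirs( os.path.join( source, 'subdir' ) )
os.makedirs( target )
open( os.path.join( source, 'subdir', 'data.txt' ), 'w' ).close()

move_merge( source, target )
# The file now lives at target/subdir/data.txt rather than target/source/subdir/data.txt.
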