galaxy-commits

10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/a66d849924d2
changeset: 3689:a66d849924d2
user: Nate Coraor <nate@bx.psu.edu>
date: Fri Apr 23 16:24:49 2010 -0400
description:
Move setting categories to the Tool class in the model
diffstat:
lib/galaxy/webapps/community/controllers/tool_browser.py | 17 +---------------
lib/galaxy/webapps/community/model/__init__.py | 9 ++++++++
2 files changed, 10 insertions(+), 16 deletions(-)
diffs (50 lines):
diff -r 87cee993fa2d -r a66d849924d2 lib/galaxy/webapps/community/controllers/tool_browser.py
--- a/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 16:11:55 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 16:24:49 2010 -0400
@@ -117,7 +117,7 @@
elif params.save_button:
tool.user_description = util.restore_text( params.description )
categories = []
- set_tool_category_associations( trans, tool, util.listify( params.category ) )
+ tool.set_categories( trans, util.listify( params.category ) )
trans.sa_session.add( tool )
trans.sa_session.flush()
return trans.response.send_redirect( web.url_for( controller='tool_browser',
@@ -131,18 +131,3 @@
categories=categories,
message=message,
status=status )
-
-## ---- Utility methods -------------------------------------------------------
-
-# It may make sense to create something like the security controller to do
-# this, but seems unnecessary for this single operation
-
-def set_tool_category_associations( trans, tool, categories, delete_existing_assocs=True ):
- if delete_existing_assocs:
- for a in tool.categories:
- trans.sa_session.delete( a )
- trans.sa_session.flush()
- for category in categories:
- if not isinstance( category, trans.model.Category ):
- category = trans.sa_session.query( trans.model.Category ).get( int( category ) )
- tool.categories.append( trans.model.ToolCategoryAssociation( tool, category ) )
diff -r 87cee993fa2d -r a66d849924d2 lib/galaxy/webapps/community/model/__init__.py
--- a/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 16:11:55 2010 -0400
+++ b/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 16:24:49 2010 -0400
@@ -123,6 +123,15 @@
self.description = datatype_bunch.description
self.version = datatype_bunch.version
self.user_id = datatype_bunch.user.id
+ def set_categories( self, trans, categories, delete_existing_assocs=True ):
+ if delete_existing_assocs:
+ for a in self.categories:
+ trans.sa_session.delete( a )
+ trans.sa_session.flush()
+ for category in categories:
+ if not isinstance( category, Category ):
+ category = trans.sa_session.query( Category ).get( int( category ) )
+ self.categories.append( ToolCategoryAssociation( self, category ) )
class Tag ( object ):
def __init__( self, id=None, type=None, parent_id=None, name=None ):
details: http://www.bx.psu.edu/hg/galaxy/rev/87cee993fa2d
changeset: 3688:87cee993fa2d
user: Nate Coraor <nate@bx.psu.edu>
date: Fri Apr 23 16:11:55 2010 -0400
description:
Add categories to the community app
diffstat:
lib/galaxy/webapps/community/controllers/admin.py | 275 ++++++++++
lib/galaxy/webapps/community/controllers/tool_browser.py | 26 +-
lib/galaxy/webapps/community/controllers/upload.py | 7 +-
lib/galaxy/webapps/community/model/__init__.py | 7 +-
lib/galaxy/webapps/community/model/mapping.py | 3 +-
lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py | 3 +-
templates/webapps/community/admin/index.mako | 9 +
templates/webapps/community/base_panels.mako | 2 +-
templates/webapps/community/tool/edit_tool.mako | 4 +-
9 files changed, 323 insertions(+), 13 deletions(-)
diffs (452 lines):
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/admin.py Fri Apr 23 16:11:55 2010 -0400
@@ -277,8 +277,283 @@
def build_initial_query( self, session ):
return session.query( self.model_class )
+class CategoryListGrid( grids.Grid ):
+ class NameColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, category ):
+ return category.name
+ class DescriptionColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, category ):
+ return category.description
+ class StatusColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, category ):
+ if category.deleted:
+ return "deleted"
+ return ""
+
+ # Grid definition
+ webapp = "community"
+ title = "Categories"
+ model_class = model.Category
+ template='/webapps/community/admin/category/grid.mako'
+ default_sort_key = "name"
+ columns = [
+ NameColumn( "Name",
+ key="name",
+ link=( lambda item: dict( operation="Edit category", id=item.id, webapp="community" ) ),
+ model_class=model.Category,
+ attach_popup=True,
+ filterable="advanced" ),
+ DescriptionColumn( "Description", attach_popup=False ),
+ StatusColumn( "Status", attach_popup=False ),
+ # Columns that are valid for filtering but are not visible.
+ grids.DeletedColumn( "Deleted", key="deleted", visible=False, filterable="advanced" )
+ ]
+ columns.append( grids.MulticolFilterColumn( "Search",
+ cols_to_filter=[ columns[0], columns[1], columns[2] ],
+ key="free-text-search",
+ visible=False,
+ filterable="standard" ) )
+ global_actions = [
+ grids.GridAction( "Add new category",
+ dict( controller='admin', action='categories', operation='create', webapp="community" ) )
+ ]
+ operations = [ grids.GridOperation( "Rename",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=False,
+ url_args=dict( webapp="community", action="rename_category" ) ),
+ grids.GridOperation( "Delete",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="community", action="mark_category_deleted" ) ),
+ grids.GridOperation( "Undelete",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="community", action="undelete_category" ) ),
+ grids.GridOperation( "Purge",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="community", action="purge_category" ) ) ]
+ standard_filters = [
+ grids.GridColumnFilter( "Active", args=dict( deleted=False ) ),
+ grids.GridColumnFilter( "Deleted", args=dict( deleted=True ) ),
+ grids.GridColumnFilter( "All", args=dict( deleted='All' ) )
+ ]
+ num_rows_per_page = 50
+ preserve_state = False
+ use_paging = True
+ def get_current_item( self, trans ):
+ return None
+ def build_initial_query( self, session ):
+ return session.query( self.model_class )
+
class AdminCommunity( BaseController, Admin ):
user_list_grid = UserListGrid()
role_list_grid = RoleListGrid()
group_list_grid = GroupListGrid()
+ category_list_grid = CategoryListGrid()
+
+ @web.expose
+ @web.require_admin
+ def categories( self, trans, **kwargs ):
+ if 'operation' in kwargs:
+ operation = kwargs['operation'].lower()
+ if operation == "create":
+ return self.create_category( trans, **kwargs )
+ if operation == "delete":
+ return self.mark_category_deleted( trans, **kwargs )
+ if operation == "undelete":
+ return self.undelete_category( trans, **kwargs )
+ if operation == "purge":
+ return self.purge_category( trans, **kwargs )
+ if operation == "rename":
+ return self.rename_category( trans, **kwargs )
+ # Render the list view
+ return self.category_list_grid( trans, **kwargs )
+
+ @web.expose
+ @web.require_admin
+ def create_category( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'community' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if params.get( 'create_category_button', False ):
+ name = util.restore_text( params.name )
+ description = util.restore_text( params.description )
+ if not name or not description:
+ message = "Enter a valid name and a description"
+ elif trans.sa_session.query( trans.app.model.Category ).filter( trans.app.model.Category.table.c.name==name ).first():
+ message = "A category with that name already exists"
+ else:
+ # Create the category
+ category = trans.app.model.Category( name=name, description=description )
+ trans.sa_session.add( category )
+ message = "Category '%s' has been created" % category.name
+ trans.sa_session.flush()
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='create_category',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ return trans.fill_template( '/webapps/community/admin/category/category_create.mako',
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def rename_category( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ id = params.get( 'id', None )
+ if not id:
+ message = "No category ids received for renaming"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ category = get_category( trans, id )
+ if params.get( 'rename_category_button', False ):
+ old_name = category.name
+ new_name = util.restore_text( params.name )
+ new_description = util.restore_text( params.description )
+ if not new_name:
+ message = 'Enter a valid name'
+ status = 'error'
+ elif trans.sa_session.query( trans.app.model.Category ).filter( trans.app.model.Category.table.c.name==new_name ).first():
+ message = 'A category with that name already exists'
+ status = 'error'
+ else:
+ category.name = new_name
+ category.description = new_description
+ trans.sa_session.add( category )
+ trans.sa_session.flush()
+ message = "Category '%s' has been renamed to '%s'" % ( old_name, new_name )
+ return trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ return trans.fill_template( '/webapps/community/admin/category/category_rename.mako',
+ category=category,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def mark_category_deleted( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No category ids received for deleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Deleted %d categories: " % len( ids )
+ for category_id in ids:
+ category = get_category( trans, category_id )
+ category.deleted = True
+ trans.sa_session.add( category )
+ trans.sa_session.flush()
+ message += " %s " % category.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def undelete_category( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No category ids received for undeleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ count = 0
+ undeleted_categories = ""
+ for category_id in ids:
+ category = get_category( trans, category_id )
+ if not category.deleted:
+ message = "Category '%s' has not been deleted, so it cannot be undeleted." % category.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ category.deleted = False
+ trans.sa_session.add( category )
+ trans.sa_session.flush()
+ count += 1
+ undeleted_categories += " %s" % category.name
+ message = "Undeleted %d categories: %s" % ( count, undeleted_categories )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def purge_category( self, trans, **kwd ):
+ # This method should only be called for a Category that has previously been deleted.
+ # Purging a deleted Category deletes all of the following from the database:
+ # - ToolCategoryAssociations where category_id == Category.id
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No category ids received for purging"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Purged %d categories: " % len( ids )
+ for category_id in ids:
+ category = get_category( trans, category_id )
+ if not category.deleted:
+ message = "Category '%s' has not been deleted, so it cannot be purged." % category.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ # Delete ToolCategoryAssociations
+ for tca in category.tools:
+ trans.sa_session.delete( tca )
+ trans.sa_session.flush()
+ message += " %s " % category.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='categories',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+
+## ---- Utility methods -------------------------------------------------------
+
+def get_category( trans, id ):
+ """Get a User from the database by id."""
+ # Load user from database
+ id = trans.security.decode_id( id )
+ category = trans.sa_session.query( trans.model.Category ).get( id )
+ if not category:
+ return trans.show_error_message( "Category not found for id (%s)" % str( id ) )
+ return category
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/controllers/tool_browser.py
--- a/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 16:11:55 2010 -0400
@@ -115,8 +115,15 @@
message = 'Uploading new version not implemented'
status = 'error'
elif params.save_button:
- tool.user_description = params.description
- tool.category = params.category
+ tool.user_description = util.restore_text( params.description )
+ categories = []
+ set_tool_category_associations( trans, tool, util.listify( params.category ) )
+ trans.sa_session.add( tool )
+ trans.sa_session.flush()
+ return trans.response.send_redirect( web.url_for( controller='tool_browser',
+ action='browse_tools',
+ message='Saved categories and description for %s' % tool.name,
+ status='done' ) )
categories = trans.sa_session.query( trans.model.Category ).order_by( trans.model.Category.table.c.name ).all()
return trans.fill_template( '/webapps/community/tool/edit_tool.mako',
encoded_id = encoded_id,
@@ -124,3 +131,18 @@
categories=categories,
message=message,
status=status )
+
+## ---- Utility methods -------------------------------------------------------
+
+# It may make sense to create something like the security controller to do
+# this, but seems unnecessary for this single operation
+
+def set_tool_category_associations( trans, tool, categories, delete_existing_assocs=True ):
+ if delete_existing_assocs:
+ for a in tool.categories:
+ trans.sa_session.delete( a )
+ trans.sa_session.flush()
+ for category in categories:
+ if not isinstance( category, trans.model.Category ):
+ category = trans.sa_session.query( trans.model.Category ).get( int( category ) )
+ tool.categories.append( trans.model.ToolCategoryAssociation( tool, category ) )
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/controllers/upload.py
--- a/lib/galaxy/webapps/community/controllers/upload.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/upload.py Fri Apr 23 16:11:55 2010 -0400
@@ -49,12 +49,15 @@
os.link( uploaded_file.name, obj.file_name )
except OSError:
shutil.copy( uploaded_file.name, obj.file_name )
- message = 'Uploaded %s' % meta.message
+ return trans.response.send_redirect( web.url_for( controller='tool_browser',
+ action='edit_tool',
+ message='Uploaded %s' % meta.message,
+ status='done' ) )
except datatypes.DatatypeVerificationError, e:
message = str( e )
status = 'error'
except sqlalchemy.exc.IntegrityError:
- message = 'A tool with the same ID already exists. If you are trying to update this tool to a new version, please ... ??? ... Otherwise, please choose a new ID.'
+ message = 'A tool with the same ID already exists. If you are trying to update this tool to a new version, please use the upload form on the "Edit Tool" page. Otherwise, please choose a new ID.'
status = 'error'
uploaded_file.close()
selected_upload_type = params.get( 'type', 'tool' )
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/model/__init__.py
--- a/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 16:11:55 2010 -0400
@@ -134,10 +134,10 @@
return "Tag(id=%s, type=%i, parent_id=%s, name=%s)" % ( self.id, self.type, self.parent_id, self.name )
class Category( object ):
- def __init__( self, id=None, name=None, description=None ):
- self.id = id
+ def __init__( self, name=None, description=None, deleted=False ):
self.name = name
self.description = description
+ self.deleted = deleted
class ItemTagAssociation ( object ):
def __init__( self, id=None, user=None, item_id=None, tag_id=None, user_tname=None, value=None ):
@@ -156,8 +156,7 @@
pass
class ToolCategoryAssociation( object ):
- def __init__( self, id=None, tool=None, category=None ):
- self.id = id
+ def __init__( self, tool=None, category=None ):
self.tool = tool
self.category = category
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/model/mapping.py Fri Apr 23 16:11:55 2010 -0400
@@ -119,7 +119,8 @@
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
Column( "name", TrimmedString( 255 ), index=True, unique=True ),
- Column( "description" , TEXT ) )
+ Column( "description" , TEXT ),
+ Column( "deleted", Boolean, index=True, default=False ) )
ToolCategoryAssociation.table = Table( "tool_category_association", metadata,
Column( "id", Integer, primary_key=True ),
diff -r 318dc4410301 -r 87cee993fa2d lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py
--- a/lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py Fri Apr 23 15:31:52 2010 -0400
+++ b/lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py Fri Apr 23 16:11:55 2010 -0400
@@ -96,7 +96,8 @@
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
Column( "name", TrimmedString( 255 ), index=True, unique=True ),
- Column( "description" , TEXT ) )
+ Column( "description" , TEXT ),
+ Column( "deleted", Boolean, index=True, default=False ) )
ToolCategoryAssociation_table = Table( "tool_category_association", metadata,
Column( "id", Integer, primary_key=True ),
diff -r 318dc4410301 -r 87cee993fa2d templates/webapps/community/admin/index.mako
--- a/templates/webapps/community/admin/index.mako Fri Apr 23 15:31:52 2010 -0400
+++ b/templates/webapps/community/admin/index.mako Fri Apr 23 16:11:55 2010 -0400
@@ -88,6 +88,15 @@
<div class="toolTitle"><a href="${h.url_for( controller='tool_browser', action='browse_tools', webapp='community' )}" target="galaxy_main">Manage tools</a></div>
</div>
</div>
+ <div class="toolSectionPad"></div>
+ <div class="toolSectionTitle">
+ <span>Community</span>
+ </div>
+ <div class="toolSectionBody">
+ <div class="toolSectionBg">
+ <div class="toolTitle"><a href="${h.url_for( controller='admin', action='categories', webapp='community' )}" target="galaxy_main">Manage categories</a></div>
+ </div>
+ </div>
</div>
</div>
</div>
diff -r 318dc4410301 -r 87cee993fa2d templates/webapps/community/base_panels.mako
--- a/templates/webapps/community/base_panels.mako Fri Apr 23 15:31:52 2010 -0400
+++ b/templates/webapps/community/base_panels.mako Fri Apr 23 16:11:55 2010 -0400
@@ -92,7 +92,7 @@
<div class="title" style="position: absolute; top: 0; left: 0;">
<a href="${app.config.get( 'logo_url', '/' )}">
<img border="0" src="${h.url_for('/static/images/galaxyIcon_noText.png')}" style="width: 26px; vertical-align: top;">
- Galaxy
+ Galaxy Community
%if app.config.brand:
<span class='brand'>/ ${app.config.brand}</span>
%endif
diff -r 318dc4410301 -r 87cee993fa2d templates/webapps/community/tool/edit_tool.mako
--- a/templates/webapps/community/tool/edit_tool.mako Fri Apr 23 15:31:52 2010 -0400
+++ b/templates/webapps/community/tool/edit_tool.mako Fri Apr 23 16:11:55 2010 -0400
@@ -25,7 +25,7 @@
<div class="form-row">
<label>Categories:</label>
<div class="form-row-input">
- <select name="category" multiple size=5>
+ <select name="category" multiple size=5 style="min-width: 250px;">
%for category in categories:
%if category.id in [ tool_category.id for tool_category in tool.categories ]:
<option value="${category.id}" selected>${category.name}</option>
@@ -39,7 +39,7 @@
</div>
<div class="form-row">
<label>Description:</label>
- <div class="form-row-input"><textarea name="description" rows="5" cols="35"></textarea></div>
+ <div class="form-row-input"><textarea name="description" rows="5" cols="35">${tool.user_description}</textarea></div>
<div style="clear: both"></div>
</div>
<div class="form-row">
details: http://www.bx.psu.edu/hg/galaxy/rev/318dc4410301
changeset: 3687:318dc4410301
user: rc
date: Fri Apr 23 15:31:52 2010 -0400
description:
lims:
- ui cleanup
- fixed a functional test bug
diffstat:
templates/admin/requests/get_data.mako | 5 ++---
test/functional/test_forms_and_requests.py | 15 ++++++++++-----
2 files changed, 12 insertions(+), 8 deletions(-)
diffs (44 lines):
diff -r f6e86e26cfe2 -r 318dc4410301 templates/admin/requests/get_data.mako
--- a/templates/admin/requests/get_data.mako Fri Apr 23 15:16:24 2010 -0400
+++ b/templates/admin/requests/get_data.mako Fri Apr 23 15:31:52 2010 -0400
@@ -113,9 +113,8 @@
</div>
<div class="form-row">
<div class="toolParamHelp" style="clear: both;">
- After clicking <b>Transfer</b> do <i>not</i> close this page or
- navigate away from this page. Once the transfer is complete
- the dataset(s) will show up on this page.
+ After selecting dataset(s), be sure to click on the <b>Start transfer</b> button.
+ Once the transfer is complete the dataset(s) will show up on this page.
</div>
<input type="submit" name="select_files_button" value="Select"/>
</div>
diff -r f6e86e26cfe2 -r 318dc4410301 test/functional/test_forms_and_requests.py
--- a/test/functional/test_forms_and_requests.py Fri Apr 23 15:16:24 2010 -0400
+++ b/test/functional/test_forms_and_requests.py Fri Apr 23 15:31:52 2010 -0400
@@ -372,15 +372,20 @@
% ( request_two.name, request_two.states.REJECTED )
def test_055_reset_data_for_later_test_runs( self ):
"""Reseting data to enable later test runs to pass"""
- # TODO: RC: add whatever is missing from this method that should be marked
- # deleted or purged so that later test runs will correctly test features if the
- # database has not be purged.
- #
# Logged in as admin_user
+ # remove the request_type permissions
+ rt_actions = sa_session.query( galaxy.model.RequestTypePermissions ) \
+ .filter(and_(galaxy.model.RequestTypePermissions.table.c.request_type_id==request_type.id) ) \
+ .order_by( desc( galaxy.model.RequestTypePermissions.table.c.create_time ) ) \
+ .all()
+ for a in rt_actions:
+ sa_session.delete( a )
+ sa_session.flush()
+
##################
# Eliminate all non-private roles
##################
- for role in [ role_one ]:
+ for role in [ role_one, role_two ]:
self.mark_role_deleted( self.security.encode_id( role.id ), role.name )
self.purge_role( self.security.encode_id( role.id ), role.name )
# Manually delete the role from the database
details: http://www.bx.psu.edu/hg/galaxy/rev/f6e86e26cfe2
changeset: 3686:f6e86e26cfe2
user: Greg Von Kuster <greg@bx.psu.edu>
date: Fri Apr 23 15:16:24 2010 -0400
description:
Miscellaneous community space fixes.
diffstat:
lib/galaxy/web/base/controller.py | 6 +-
lib/galaxy/web/framework/__init__.py | 6 +-
lib/galaxy/web/security/__init__.py | 6 +-
lib/galaxy/webapps/community/controllers/admin.py | 1 -
lib/galaxy/webapps/community/controllers/upload.py | 10 +-
lib/galaxy/webapps/community/model/__init__.py | 7 +-
setup.sh | 2 +-
templates/admin/center.mako | 187 ---------------------
templates/webapps/community/admin/center.mako | 41 ++++
templates/webapps/community/admin/index.mako | 20 +-
templates/webapps/galaxy/admin/center.mako | 187 +++++++++++++++++++++
templates/webapps/galaxy/admin/index.mako | 2 +-
12 files changed, 265 insertions(+), 210 deletions(-)
diffs (620 lines):
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/web/base/controller.py Fri Apr 23 15:16:24 2010 -0400
@@ -330,7 +330,11 @@
@web.expose
@web.require_admin
def center( self, trans, **kwd ):
- return trans.fill_template( '/admin/center.mako' )
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ if webapp == 'galaxy':
+ return trans.fill_template( '/webapps/galaxy/admin/center.mako' )
+ else:
+ return trans.fill_template( '/webapps/community/admin/center.mako' )
@web.expose
@web.require_admin
def reload_tool( self, trans, **kwd ):
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/web/framework/__init__.py Fri Apr 23 15:16:24 2010 -0400
@@ -268,7 +268,7 @@
galaxy_session_requires_flush = False
if secure_id:
# Decode the cookie value to get the session_key
- session_key = self.security.decode_session_key( secure_id )
+ session_key = self.security.decode_guid( secure_id )
try:
# Make sure we have a valid UTF-8 string
session_key = session_key.encode( 'utf8' )
@@ -365,7 +365,7 @@
Caller is responsible for flushing the returned session.
"""
- session_key = self.security.get_new_session_key()
+ session_key = self.security.get_new_guid()
galaxy_session = self.app.model.GalaxySession(
session_key=session_key,
is_valid=True,
@@ -411,7 +411,7 @@
"""
Update the session cookie to match the current session.
"""
- self.set_cookie( self.security.encode_session_key( self.galaxy_session.session_key ), name=name, path=self.app.config.cookie_path )
+ self.set_cookie( self.security.encode_guid( self.galaxy_session.session_key ), name=name, path=self.app.config.cookie_path )
def handle_user_login( self, user, webapp ):
"""
Login a new user (possibly newly created)
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/web/security/__init__.py
--- a/lib/galaxy/web/security/__init__.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/web/security/__init__.py Fri Apr 23 15:16:24 2010 -0400
@@ -43,16 +43,16 @@
return self.id_cipher.encrypt( s ).encode( 'hex' )
def decode_id( self, obj_id ):
return int( self.id_cipher.decrypt( obj_id.decode( 'hex' ) ).lstrip( "!" ) )
- def encode_session_key( self, session_key ):
+ def encode_guid( self, session_key ):
# Session keys are strings
# Pad to a multiple of 8 with leading "!"
s = ( "!" * ( 8 - len( session_key ) % 8 ) ) + session_key
# Encrypt
return self.id_cipher.encrypt( s ).encode( 'hex' )
- def decode_session_key( self, session_key ):
+ def decode_guid( self, session_key ):
# Session keys are strings
return self.id_cipher.decrypt( session_key.decode( 'hex' ) ).lstrip( "!" )
- def get_new_session_key( self ):
+ def get_new_guid( self ):
# Generate a unique, high entropy 128 bit random number
return get_random_bytes( 16 )
\ No newline at end of file
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/admin.py Fri Apr 23 15:16:24 2010 -0400
@@ -43,7 +43,6 @@
return self.format( user.galaxy_sessions[ 0 ].update_time )
return 'never'
- log.debug("####In UserListGrid, in community" )
# Grid definition
webapp = "community"
title = "Users"
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/webapps/community/controllers/upload.py
--- a/lib/galaxy/webapps/community/controllers/upload.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/upload.py Fri Apr 23 15:16:24 2010 -0400
@@ -41,6 +41,7 @@
try:
meta = datatype.verify( uploaded_file )
meta.user = trans.user
+ meta.guid = trans.app.security.get_new_guid()
obj = datatype.create_model_object( meta )
trans.sa_session.add( obj )
trans.sa_session.flush()
@@ -57,7 +58,8 @@
status = 'error'
uploaded_file.close()
selected_upload_type = params.get( 'type', 'tool' )
- return trans.fill_template( '/webapps/community/upload/upload.mako', message=message,
- status=status,
- selected_upload_type=selected_upload_type,
- upload_types=trans.app.datatypes_registry.get_datatypes_for_select_list() )
+ return trans.fill_template( '/webapps/community/upload/upload.mako',
+ message=message,
+ status=status,
+ selected_upload_type=selected_upload_type,
+ upload_types=trans.app.datatypes_registry.get_datatypes_for_select_list() )
diff -r 34eec4d48cc4 -r f6e86e26cfe2 lib/galaxy/webapps/community/model/__init__.py
--- a/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 13:16:40 2010 -0400
+++ b/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 15:16:24 2010 -0400
@@ -93,7 +93,6 @@
self.name = name or "Unnamed tool"
self.description = description
self.user_description = user_description
- self.category = category
self.version = version or "1.0.0"
self.user_id = user_id
self.external_filename = external_filename
@@ -117,11 +116,13 @@
self.external_filename = filename
file_name = property( get_file_name, set_file_name )
def create_from_datatype( self, datatype_bunch ):
+ # TODO: ensure guid is unique and generate a new one if not.
+ self.guid = datatype_bunch.guid
self.tool_id = datatype_bunch.id
self.name = datatype_bunch.name
+ self.description = datatype_bunch.description
self.version = datatype_bunch.version
- self.description = datatype_bunch.description
- self.user_id = datatype_bunch.user
+ self.user_id = datatype_bunch.user.id
class Tag ( object ):
def __init__( self, id=None, type=None, parent_id=None, name=None ):
diff -r 34eec4d48cc4 -r f6e86e26cfe2 setup.sh
--- a/setup.sh Fri Apr 23 13:16:40 2010 -0400
+++ b/setup.sh Fri Apr 23 15:16:24 2010 -0400
@@ -31,7 +31,7 @@
DIRS="
database
database/files
-database/tools
+database/community_files
database/tmp
database/compiled_templates
database/job_working_directory
diff -r 34eec4d48cc4 -r f6e86e26cfe2 templates/admin/center.mako
--- a/templates/admin/center.mako Fri Apr 23 13:16:40 2010 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,187 +0,0 @@
-<%inherit file="/base.mako"/>
-
-<%def name="title()">Galaxy Administration</%def>
-
-<h2>Administration</h2>
-
-<p>The menu on the left provides the following features</p>
-<ul>
- <li><strong>Security</strong> - see the <strong>Data Security and Data Libraries</strong> section below for details
- <p/>
- <ul>
- <li>
- <strong>Manage users</strong> - provides a view of the registered users and all groups and non-private roles associated
- with each user.
- </li>
- <p/>
- <li>
- <strong>Manage groups</strong> - provides a view of all groups along with the members of the group and the roles associated with
- each group (both private and non-private roles). The group names include a link to a page that allows you to manage the users and
- roles that are associated with the group.
- </li>
- <p/>
- <li>
- <strong>Manage roles</strong> - provides a view of all non-private roles along with the role type, and the users and groups that
- are associated with the role. The role names include a link to a page that allows you to manage the users and groups that are associated
- with the role. The page also includes a view of the data library datasets that are associated with the role and the permissions applied
- to each dataset.
- </li>
- </ul>
- </li>
- <p/>
- <li><strong>Data</strong>
- <p/>
- <ul>
- <li>
- <strong>Manage data libraries</strong> - Data libraries enable a Galaxy administrator to upload datasets into a data library. Currently,
- only administrators can create data libraries.
- <p/>
- When a data library is first created, it is considered "public" since it will be displayed in the "Data Libraries" view for any user, even
- those that are not logged in. The Galaxy administrator can restrict access to a data library by associating roles with the data library's
- "access library" permission. This permission will conservatively override the [dataset] "access" permission for the data library's contained
- datasets.
- <p/>
- For example, if a data library's "access library" permission is associated with Role1 and the data library contains "public" datasets, the
- data library will still only be displayed to those users that have Role1. However, if the data library's "access library" permission is
- associated with both Role1 and Role2 and the data library contains datasets whose [dataset] "access" permission is associated with only Role1,
- then users that have Role2 will be able to access the library, but will not see those contained datasets whose [dataset] "access" permission
- is associated with only Role1.
- <p/>
- In addition to the "access library" permission, permission to perform the following functions on the data library (and it's contents) can
- be granted to users (a library item is one of: a data library, a library folder, a library dataset).
- <p/>
- <ul>
- <li><strong>add library item</strong> - Role members can add library items to this data library or folder</li>
- <li><strong>modify library item</strong> - Role members can modify this library item</li>
- <li><strong>manage library permissions</strong> - Role members can manage permissions applied to this library item</li>
- </ul>
- <p/>
- The default behavior is for no permissions to be applied to a data library item, but applied permissions are inherited downward (with the exception
- of the "access library" permission, which is only available on the data library itself). Because of this, it is important to set desired permissions
- on a new data library when it is created. When this is done, new folders and datasets added to the data library will automatically inherit those
- permissions. In the same way, permissions can be applied to a folder, which will be automatically inherited by all contained datasets and sub-folders.
- <p/>
- The "Data Libraries" menu item allows users to access the datasets in a data library as long as they are not restricted from accessing them.
- Importing a library dataset into a history will not make a copy of the dataset, but will be a "pointer" to the dataset on disk. This
- approach allows for multiple users to use a single (possibly very large) dataset file.
- </li>
- </ul>
- </li>
- <p/>
- <li><strong>Server</strong>
- <p/>
- <ul>
- <li>
- <strong>Reload a tool's configuration</strong> - allows a new version of a tool to be loaded while the server is running
- </li>
- <p/>
- <li>
- <strong>Profile memory usage</strong> - measures system memory used for certain Galaxy functions
- </li>
- <p/>
- <li>
- <strong>Manage jobs</strong> - displays all jobs that are currently not finished (i.e., their state is new, waiting, queued, or
- running). Administrators are able to cleanly stop long-running jobs.
- </li>
- </ul>
- </li>
- <p/>
- <li><strong>Forms</strong>
- <p/>To be completed
- </li>
- <p/>
- <li><strong>Sequencing Requests</strong>
- <p/>To be completed
- </li>
- <p/>
- <li><strong>Cloud</strong>
- <p/>To be completed
- </li>
-</ul>
-<p/>
-<p><strong>Data Security and Data Libraries</strong></p>
-<p/>
-<strong>Security</strong> - Data security in Galaxy is a new feature, so familiarize yourself with the details which can be found
-here or in our <a href="http://g2.trac.bx.psu.edu/wiki/SecurityFeatures" target="_blank">data security page</a>. The data security
-process incorporates users, groups and roles, and enables the application of certain permissions on datasets, specifically "access"
-and "manage permissions". By default, the "manage permissions" permission is associated with the dataset owner's private role, and
-the "access" permission is not set, making the dataset public. With these default permissions, users should not see any difference
-in the way Galaxy has behaved in the past.
-<ul>
- <li>
- <strong>Users</strong> - registered Galaxy users that have created a Galaxy account. Users can belong to groups and can
- be associated with 1 or more roles. If a user is not authenticated during a Galaxy session, they will not have access to any
- of the security features, and datasets they create during that session will have no permissions applied to them (i.e., they
- will be considered "public", and no one will be allowed to change permissions on them).
- </li>
- <p/>
- <li>
- <strong>Groups</strong> - a set of 0 or more users which are considered members of the group. Groups can be associated with 0
- or more roles, simplifying the process of applying permissions to the data between a select group of users.
- </li>
- <p/>
- <li>
- <strong>Roles</strong> - associate users and groups with specific permissions on datasets. For example, users in groups A and B
- can be associated with role C which gives them the "access" permission on datasets D, E and F. Roles have a type which is currently
- one of the following:
- <ul>
- <li>
- <strong>private</strong> - every user is associated automatically with their own private role. Administrators cannot
- manage private roles.
- </li>
- <li>
- <strong>user</strong> - this is currently not used, but eventually any registered user will be able to create a new role
- and this will be it's type.
- </li>
- <li>
- <strong>sharing</strong> - a role created automatically during a Galaxy session that enables a user to share data with
- another user. This can generally be considered a temporary role.
- </li>
- <li><strong>admin</strong> - a role created by a Galaxy administrator.</li>
- </ul>
- </li>
- <p/>
- <li>
- <strong>Dataset Permissions</strong> - applying the following permissions will to a dataset will result in the behavior described.
- <ul>
- <li>
- <strong>access</strong> - users associated with the role can import this dataset into their history for analysis.
- <p>
- If no roles with the "access" permission are associated with a dataset, the dataset is "public" and may be accessed by anyone
- that can access the data library in which it is contained. See the <strong>Manage data libraries</strong> section above for
- details. Public datasets contained in public data libraries will be accessible to all users (as well as anyone not logged in
- during a Galaxy session) from the list of data libraries displayed when the "Data Libraries" menu item is selected.
- </p>
- <p>
- Associating a dataset with a role that includes the "access" permission restricts the set of users that can access it.
- For example, if 'Role A' includes the "access" permission and 'Role A' is associated with the dataset, only those users
- and groups who are associated with 'Role A' may access the dataset.
- </p>
- <p>
- If multiple roles that include the "access" permission are associated with a dataset, access to the dataset is derived
- from the intersection of the users associated with the roles. For example, if 'Role A' and 'Role B' are associated with
- a dataset, only those users and groups who are associated with both 'Role A' AND 'Role B' may access the dataset. When
- the "access" permission is applied to a dataset, Galaxy checks to make sure that at least 1 user belongs to all groups and
- roles associated with the "access" permission (otherwise the dataset would be restricted from everyone).
- </p>
- <p>
- In order for a user to make a dataset private (i.e., only they can access it), they should associate the dataset with
- their private role (the role identical to their Galaxy user name / email address). Associating additional roles that
- include the "access" permission is not possible, since it would render the dataset inaccessible to everyone.
- <p>
- To make a dataset private to themselves and one or more other users, the user can create a new role and associate the dataset
- with that role, not their "private role". Galaxy makes this easy by telling the user they are about to share a private dataset
- and giving them the option of doing so. If they respond positively, the sharing role is automatically created for them.
- </p>
- <p>
- Private data (data associated with roles that include the "access" permission) must be made public in order to be used
- with external applications like the "view at UCSC" link, or the "Perform genome analysis and prediction with EpiGRAPH"
- tool. Being made publically accessible means removing the association of all roles that include the "access" permission
- from the dataset.
- <p>
- </li>
- <li><strong>manage permissions</strong> - Role members can manage the permissions applied to this dataset</li>
- </ul>
- </li>
-</ul>
-<br/>
diff -r 34eec4d48cc4 -r f6e86e26cfe2 templates/webapps/community/admin/center.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/webapps/community/admin/center.mako Fri Apr 23 15:16:24 2010 -0400
@@ -0,0 +1,41 @@
+<%inherit file="/base.mako"/>
+
+<%def name="title()">Galaxy Administration</%def>
+
+<h2>Administration</h2>
+
+<p>The menu on the left provides the following features</p>
+<ul>
+ <li><strong>Security</strong> - see the <strong>Data Security and Data Libraries</strong> section below for details
+ <p/>
+ <ul>
+ <li>
+ <strong>Manage users</strong> - provides a view of the registered users and all groups and non-private roles associated
+ with each user.
+ </li>
+ <p/>
+ <li>
+ <strong>Manage groups</strong> - provides a view of all groups along with the members of the group and the roles associated with
+ each group (both private and non-private roles). The group names include a link to a page that allows you to manage the users and
+ roles that are associated with the group.
+ </li>
+ <p/>
+ <li>
+ <strong>Manage roles</strong> - provides a view of all non-private roles along with the role type, and the users and groups that
+ are associated with the role. The role names include a link to a page that allows you to manage the users and groups that are associated
+ with the role. The page also includes a view of the data library datasets that are associated with the role and the permissions applied
+ to each dataset.
+ </li>
+ </ul>
+ </li>
+ <p/>
+ <li><strong>Tools</strong>
+ <p/>
+ <ul>
+ <li>
+ <strong>Manage tools</strong> - coming soon...
+ </li>
+ </ul>
+ </li>
+</ul>
+<br/>
diff -r 34eec4d48cc4 -r f6e86e26cfe2 templates/webapps/community/admin/index.mako
--- a/templates/webapps/community/admin/index.mako Fri Apr 23 13:16:40 2010 -0400
+++ b/templates/webapps/community/admin/index.mako Fri Apr 23 15:16:24 2010 -0400
@@ -74,11 +74,19 @@
<span>Security</span>
</div>
<div class="toolSectionBody">
- <div class="toolSectionBg">
- <div class="toolTitle"><a href="${h.url_for( controller='admin', action='users', webapp='community' )}" target="galaxy_main">Manage users</a></div>
- <div class="toolTitle"><a href="${h.url_for( controller='admin', action='groups', webapp='community' )}" target="galaxy_main">Manage groups</a></div>
- <div class="toolTitle"><a href="${h.url_for( controller='admin', action='roles', webapp='community' )}" target="galaxy_main">Manage roles</a></div>
- </div>
+ <div class="toolSectionBg">
+ <div class="toolTitle"><a href="${h.url_for( controller='admin', action='users', webapp='community' )}" target="galaxy_main">Manage users</a></div>
+ <div class="toolTitle"><a href="${h.url_for( controller='admin', action='groups', webapp='community' )}" target="galaxy_main">Manage groups</a></div>
+ <div class="toolTitle"><a href="${h.url_for( controller='admin', action='roles', webapp='community' )}" target="galaxy_main">Manage roles</a></div>
+ </div>
+ </div>
+ <div class="toolSectionTitle">
+ <span>Tools</span>
+ </div>
+ <div class="toolSectionBody">
+ <div class="toolSectionBg">
+ <div class="toolTitle"><a href="${h.url_for( controller='tool_browser', action='browse_tools', webapp='community' )}" target="galaxy_main">Manage tools</a></div>
+ </div>
</div>
</div>
</div>
@@ -87,7 +95,7 @@
<%def name="center_panel()">
<%
- center_url = h.url_for( action='center' )
+ center_url = h.url_for( action='center', webapp='community' )
%>
<iframe name="galaxy_main" id="galaxy_main" frameborder="0" style="position: absolute; width: 100%; height: 100%;" src="${center_url}"> </iframe>
</%def>
diff -r 34eec4d48cc4 -r f6e86e26cfe2 templates/webapps/galaxy/admin/center.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/webapps/galaxy/admin/center.mako Fri Apr 23 15:16:24 2010 -0400
@@ -0,0 +1,187 @@
+<%inherit file="/base.mako"/>
+
+<%def name="title()">Galaxy Administration</%def>
+
+<h2>Administration</h2>
+
+<p>The menu on the left provides the following features</p>
+<ul>
+ <li><strong>Security</strong> - see the <strong>Data Security and Data Libraries</strong> section below for details
+ <p/>
+ <ul>
+ <li>
+ <strong>Manage users</strong> - provides a view of the registered users and all groups and non-private roles associated
+ with each user.
+ </li>
+ <p/>
+ <li>
+ <strong>Manage groups</strong> - provides a view of all groups along with the members of the group and the roles associated with
+ each group (both private and non-private roles). The group names include a link to a page that allows you to manage the users and
+ roles that are associated with the group.
+ </li>
+ <p/>
+ <li>
+ <strong>Manage roles</strong> - provides a view of all non-private roles along with the role type, and the users and groups that
+ are associated with the role. The role names include a link to a page that allows you to manage the users and groups that are associated
+ with the role. The page also includes a view of the data library datasets that are associated with the role and the permissions applied
+ to each dataset.
+ </li>
+ </ul>
+ </li>
+ <p/>
+ <li><strong>Data</strong>
+ <p/>
+ <ul>
+ <li>
+ <strong>Manage data libraries</strong> - Data libraries enable a Galaxy administrator to upload datasets into a data library. Currently,
+ only administrators can create data libraries.
+ <p/>
+ When a data library is first created, it is considered "public" since it will be displayed in the "Data Libraries" view for any user, even
+ those that are not logged in. The Galaxy administrator can restrict access to a data library by associating roles with the data library's
+ "access library" permission. This permission will conservatively override the [dataset] "access" permission for the data library's contained
+ datasets.
+ <p/>
+ For example, if a data library's "access library" permission is associated with Role1 and the data library contains "public" datasets, the
+ data library will still only be displayed to those users that have Role1. However, if the data library's "access library" permission is
+ associated with both Role1 and Role2 and the data library contains datasets whose [dataset] "access" permission is associated with only Role1,
+ then users that have Role2 will be able to access the library, but will not see those contained datasets whose [dataset] "access" permission
+ is associated with only Role1.
+ <p/>
+ In addition to the "access library" permission, permission to perform the following functions on the data library (and it's contents) can
+ be granted to users (a library item is one of: a data library, a library folder, a library dataset).
+ <p/>
+ <ul>
+ <li><strong>add library item</strong> - Role members can add library items to this data library or folder</li>
+ <li><strong>modify library item</strong> - Role members can modify this library item</li>
+ <li><strong>manage library permissions</strong> - Role members can manage permissions applied to this library item</li>
+ </ul>
+ <p/>
+ The default behavior is for no permissions to be applied to a data library item, but applied permissions are inherited downward (with the exception
+ of the "access library" permission, which is only available on the data library itself). Because of this, it is important to set desired permissions
+ on a new data library when it is created. When this is done, new folders and datasets added to the data library will automatically inherit those
+ permissions. In the same way, permissions can be applied to a folder, which will be automatically inherited by all contained datasets and sub-folders.
+ <p/>
+ The "Data Libraries" menu item allows users to access the datasets in a data library as long as they are not restricted from accessing them.
+ Importing a library dataset into a history will not make a copy of the dataset, but will be a "pointer" to the dataset on disk. This
+ approach allows for multiple users to use a single (possibly very large) dataset file.
+ </li>
+ </ul>
+ </li>
+ <p/>
+ <li><strong>Server</strong>
+ <p/>
+ <ul>
+ <li>
+ <strong>Reload a tool's configuration</strong> - allows a new version of a tool to be loaded while the server is running
+ </li>
+ <p/>
+ <li>
+ <strong>Profile memory usage</strong> - measures system memory used for certain Galaxy functions
+ </li>
+ <p/>
+ <li>
+ <strong>Manage jobs</strong> - displays all jobs that are currently not finished (i.e., their state is new, waiting, queued, or
+ running). Administrators are able to cleanly stop long-running jobs.
+ </li>
+ </ul>
+ </li>
+ <p/>
+ <li><strong>Forms</strong>
+ <p/>To be completed
+ </li>
+ <p/>
+ <li><strong>Sequencing Requests</strong>
+ <p/>To be completed
+ </li>
+ <p/>
+ <li><strong>Cloud</strong>
+ <p/>To be completed
+ </li>
+</ul>
+<p/>
+<p><strong>Data Security and Data Libraries</strong></p>
+<p/>
+<strong>Security</strong> - Data security in Galaxy is a new feature, so familiarize yourself with the details which can be found
+here or in our <a href="http://g2.trac.bx.psu.edu/wiki/SecurityFeatures" target="_blank">data security page</a>. The data security
+process incorporates users, groups and roles, and enables the application of certain permissions on datasets, specifically "access"
+and "manage permissions". By default, the "manage permissions" permission is associated with the dataset owner's private role, and
+the "access" permission is not set, making the dataset public. With these default permissions, users should not see any difference
+in the way Galaxy has behaved in the past.
+<ul>
+ <li>
+ <strong>Users</strong> - registered Galaxy users that have created a Galaxy account. Users can belong to groups and can
+ be associated with 1 or more roles. If a user is not authenticated during a Galaxy session, they will not have access to any
+ of the security features, and datasets they create during that session will have no permissions applied to them (i.e., they
+ will be considered "public", and no one will be allowed to change permissions on them).
+ </li>
+ <p/>
+ <li>
+ <strong>Groups</strong> - a set of 0 or more users which are considered members of the group. Groups can be associated with 0
+ or more roles, simplifying the process of applying permissions to the data between a select group of users.
+ </li>
+ <p/>
+ <li>
+ <strong>Roles</strong> - associate users and groups with specific permissions on datasets. For example, users in groups A and B
+ can be associated with role C which gives them the "access" permission on datasets D, E and F. Roles have a type which is currently
+ one of the following:
+ <ul>
+ <li>
+ <strong>private</strong> - every user is associated automatically with their own private role. Administrators cannot
+ manage private roles.
+ </li>
+ <li>
+ <strong>user</strong> - this is currently not used, but eventually any registered user will be able to create a new role
+ and this will be it's type.
+ </li>
+ <li>
+ <strong>sharing</strong> - a role created automatically during a Galaxy session that enables a user to share data with
+ another user. This can generally be considered a temporary role.
+ </li>
+ <li><strong>admin</strong> - a role created by a Galaxy administrator.</li>
+ </ul>
+ </li>
+ <p/>
+ <li>
+ <strong>Dataset Permissions</strong> - applying the following permissions will to a dataset will result in the behavior described.
+ <ul>
+ <li>
+ <strong>access</strong> - users associated with the role can import this dataset into their history for analysis.
+ <p>
+ If no roles with the "access" permission are associated with a dataset, the dataset is "public" and may be accessed by anyone
+ that can access the data library in which it is contained. See the <strong>Manage data libraries</strong> section above for
+ details. Public datasets contained in public data libraries will be accessible to all users (as well as anyone not logged in
+ during a Galaxy session) from the list of data libraries displayed when the "Data Libraries" menu item is selected.
+ </p>
+ <p>
+ Associating a dataset with a role that includes the "access" permission restricts the set of users that can access it.
+ For example, if 'Role A' includes the "access" permission and 'Role A' is associated with the dataset, only those users
+ and groups who are associated with 'Role A' may access the dataset.
+ </p>
+ <p>
+ If multiple roles that include the "access" permission are associated with a dataset, access to the dataset is derived
+ from the intersection of the users associated with the roles. For example, if 'Role A' and 'Role B' are associated with
+ a dataset, only those users and groups who are associated with both 'Role A' AND 'Role B' may access the dataset. When
+ the "access" permission is applied to a dataset, Galaxy checks to make sure that at least 1 user belongs to all groups and
+ roles associated with the "access" permission (otherwise the dataset would be restricted from everyone).
+ </p>
+ <p>
+ In order for a user to make a dataset private (i.e., only they can access it), they should associate the dataset with
+ their private role (the role identical to their Galaxy user name / email address). Associating additional roles that
+ include the "access" permission is not possible, since it would render the dataset inaccessible to everyone.
+ </p>
+ <p>
+ To make a dataset accessible only to themselves and one or more other users, a user can create a new role and associate the dataset
+ with that role rather than with their "private role". Galaxy makes this easy by telling the user they are about to share a private dataset
+ and giving them the option of doing so. If they respond positively, the sharing role is automatically created for them.
+ </p>
+ <p>
+ Private data (data associated with roles that include the "access" permission) must be made public in order to be used
+ with external applications like the "view at UCSC" link, or the "Perform genome analysis and prediction with EpiGRAPH"
+ tool. Being made publicly accessible means removing the association of all roles that include the "access" permission
+ from the dataset.
+ </p>
+ </li>
+ <li><strong>manage permissions</strong> - users associated with the role can manage the permissions applied to this dataset.</li>
+ </ul>
+ </li>
+</ul>
+<br/>
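The intersection rule for multiple "access" roles documented above is subtle, so here is a minimal standalone sketch of the semantics (plain Python with hypothetical user and role names; not Galaxy code):

    # Each "access" role is modeled as a set of user names (hypothetical).
    role_a = set( [ 'alice', 'bob', 'carol' ] )
    role_b = set( [ 'bob', 'carol', 'dave' ] )

    def users_with_access( access_roles ):
        # No "access" roles at all means the dataset is public.
        if not access_roles:
            return 'public'
        # With multiple "access" roles, only users present in EVERY role
        # (the intersection) may access the dataset.
        users = set( access_roles[0] )
        for role in access_roles[1:]:
            users &= set( role )
        return users

    print( users_with_access( [ role_a, role_b ] ) )  # set(['bob', 'carol'])
    print( users_with_access( [] ) )                  # 'public'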
diff -r 34eec4d48cc4 -r f6e86e26cfe2 templates/webapps/galaxy/admin/index.mako
--- a/templates/webapps/galaxy/admin/index.mako Fri Apr 23 13:16:40 2010 -0400
+++ b/templates/webapps/galaxy/admin/index.mako Fri Apr 23 15:16:24 2010 -0400
@@ -133,7 +133,7 @@
<%def name="center_panel()">
<%
- center_url = h.url_for( action='center' )
+ center_url = h.url_for( action='center', webapp='galaxy' )
%>
<iframe name="galaxy_main" id="galaxy_main" frameborder="0" style="position: absolute; width: 100%; height: 100%;" src="${center_url}"> </iframe>
</%def>
details: http://www.bx.psu.edu/hg/galaxy/rev/34eec4d48cc4
changeset: 3685:34eec4d48cc4
user: rc
date: Fri Apr 23 13:16:40 2010 -0400
description:
lims: add tests for request_type permissions
diffstat:
test/base/twilltestcase.py | 15 ++
test/functional/test_forms_and_requests.py | 214 ++++++++++++++++++----------
2 files changed, 151 insertions(+), 78 deletions(-)
diffs (337 lines):
diff -r a77ec2944999 -r 34eec4d48cc4 test/base/twilltestcase.py
--- a/test/base/twilltestcase.py Fri Apr 23 11:30:16 2010 -0400
+++ b/test/base/twilltestcase.py Fri Apr 23 13:16:40 2010 -0400
@@ -1526,6 +1526,21 @@
tc.fv("1", "state_desc_%i" % index, state[1])
tc.submit( "save_request_type" )
self.check_page_for_string( "Request type <b>%s</b> has been created" % name )
+ def request_type_permissions( self, request_type_id, request_type_name, role_ids_str, permissions_in, permissions_out ):
+ # role_ids_str must be a comma-separated string of role ids
+ url = "requests_admin/manage_request_types?operation=permissions&id=%s&update_roles_button=Save" % ( request_type_id )
+ for po in permissions_out:
+ key = '%s_out' % po
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
+ for pi in permissions_in:
+ key = '%s_in' % pi
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
+ self.home()
+ self.visit_url( "%s/%s" % ( self.url, url ) )
+ print url
+ check_str = "Permissions updated for request type '%s'" % request_type_name
+ self.check_page_for_string( check_str )
+ self.home()
def create_request( self, request_type_id, name, desc, fields ):
self.home()
self.visit_url( "%s/requests/new?create=True&select_request_type=%i" % ( self.url,
diff -r a77ec2944999 -r 34eec4d48cc4 test/functional/test_forms_and_requests.py
--- a/test/functional/test_forms_and_requests.py Fri Apr 23 11:30:16 2010 -0400
+++ b/test/functional/test_forms_and_requests.py Fri Apr 23 13:16:40 2010 -0400
@@ -2,6 +2,7 @@
from galaxy.model.orm import *
from galaxy.model.mapping import context as sa_session
from base.twilltestcase import *
+from base.test_db_util import *
not_logged_in_as_admin_security_msg = 'You must be logged in as an administrator to access this feature.'
logged_in_as_admin_security_msg = 'You must be an administrator to access this feature.'
@@ -37,7 +38,80 @@
class TestFormsAndRequests( TwillTestCase ):
- def test_000_create_form( self ):
+ def test_000_initiate_users( self ):
+ """Ensuring all required user accounts exist"""
+ self.logout()
+ self.login( email='test1(a)bx.psu.edu', username='regular-user1' )
+ global regular_user1
+ regular_user1 = get_user( 'test1(a)bx.psu.edu' )
+ assert regular_user1 is not None, 'Problem retrieving user with email "test1(a)bx.psu.edu" from the database'
+ global regular_user1_private_role
+ regular_user1_private_role = get_private_role( regular_user1 )
+ self.logout()
+ self.login( email='test2(a)bx.psu.edu', username='regular-user2' )
+ global regular_user2
+ regular_user2 = get_user( 'test2(a)bx.psu.edu' )
+ assert regular_user2 is not None, 'Problem retrieving user with email "test2(a)bx.psu.edu" from the database'
+ global regular_user2_private_role
+ regular_user2_private_role = get_private_role( regular_user2 )
+ self.logout()
+ self.login( email='test3(a)bx.psu.edu', username='regular-user3' )
+ global regular_user3
+ regular_user3 = get_user( 'test3(a)bx.psu.edu' )
+ assert regular_user3 is not None, 'Problem retrieving user with email "test3(a)bx.psu.edu" from the database'
+ global regular_user3_private_role
+ regular_user3_private_role = get_private_role( regular_user3 )
+ self.logout()
+ self.login( email='test(a)bx.psu.edu', username='admin-user' )
+ global admin_user
+ admin_user = get_user( 'test(a)bx.psu.edu' )
+ assert admin_user is not None, 'Problem retrieving user with email "test(a)bx.psu.edu" from the database'
+ global admin_user_private_role
+ admin_user_private_role = get_private_role( admin_user )
+ def test_005_create_required_groups_and_roles( self ):
+ """Testing creating all required groups and roles for this script"""
+ # Logged in as admin_user
+ # Create role_one
+ name = 'Role One'
+ description = "This is Role One's description"
+ user_ids = [ str( admin_user.id ), str( regular_user1.id ), str( regular_user3.id ) ]
+ self.create_role( name=name,
+ description=description,
+ in_user_ids=user_ids,
+ in_group_ids=[],
+ create_group_for_role='no',
+ private_role=admin_user.email )
+ # Get the role object for later tests
+ global role_one
+ role_one = get_role_by_name( name )
+ # Create group_one
+ name = 'Group One'
+ self.create_group( name=name, in_user_ids=[ str( regular_user1.id ) ], in_role_ids=[ str( role_one.id ) ] )
+ # Get the group object for later tests
+ global group_one
+ group_one = get_group_by_name( name )
+ assert group_one is not None, 'Problem retrieving group named "Group One" from the database'
+ # NOTE: To get this to work with twill, all select lists on the ~/admin/role page must contain at least
+ # 1 option value or twill throws an exception, which is: ParseError: OPTION outside of SELECT
+ # Due to this bug in twill, we create the role, we bypass the page and visit the URL in the
+ # associate_users_and_groups_with_role() method.
+ #
+ #create role_two
+ name = 'Role Two'
+ description = 'This is Role Two'
+ user_ids = [ str( admin_user.id ) ]
+ group_ids = [ str( group_one.id ) ]
+ private_role = admin_user.email
+ self.create_role( name=name,
+ description=description,
+ in_user_ids=user_ids,
+ in_group_ids=group_ids,
+ private_role=private_role )
+ # Get the role object for later tests
+ global role_two
+ role_two = get_role_by_name( name )
+ assert role_two is not None, 'Problem retrieving role named "Role Two" from the database'
+ def test_010_create_form( self ):
"""Testing creating a new form and editing it"""
self.logout()
self.login( email='test(a)bx.psu.edu' )
@@ -58,7 +132,7 @@
self.check_page_for_string( new_name )
self.check_page_for_string( new_desc )
form_one_name = new_name
- def test_005_add_form_fields( self ):
+ def test_015_add_form_fields( self ):
"""Testing adding fields to a form definition"""
fields = [dict(name='Test field name one',
desc='Test field description one',
@@ -78,7 +152,7 @@
field_index=len(form_one.fields), fields=fields)
form_one_latest = get_latest_form(form_one_name)
assert len(form_one_latest.fields) == len(form_one.fields)+len(fields)
- def test_015_create_sample_form( self ):
+ def test_020_create_sample_form( self ):
"""Testing creating another form (for samples)"""
global form_two_name
desc = "This is Form Two's description"
@@ -90,7 +164,7 @@
self.check_page_for_string( form_two_name )
self.check_page_for_string( desc )
self.check_page_for_string( formtype )
- def test_020_create_request_type( self ):
+ def test_025_create_request_type( self ):
"""Testing creating a new requestype"""
request_form = get_latest_form(form_one_name)
sample_form = get_latest_form(form_two_name)
@@ -102,93 +176,77 @@
.order_by( desc( galaxy.model.RequestType.table.c.create_time ) ) \
.first()
assert request_type is not None, 'Problem retrieving request type named "%s" from the database' % request_type_name
- def test_025_create_address_and_library( self ):
+ # Set permissions
+ permissions_in = [ k for k, v in galaxy.model.RequestType.permitted_actions.items() ]
+ permissions_out = []
+ # Role one members are: admin_user, regular_user1, regular_user3. Each of these users will be permitted for
+ # REQUEST_TYPE_ACCESS on this request_type
+ self.request_type_permissions(self.security.encode_id( request_type.id ),
+ request_type.name,
+ str( role_one.id ),
+ permissions_in,
+ permissions_out )
+ # Make sure the request_type is not accessible by regular_user2 since regular_user2 does not have Role1.
+ self.logout()
+ self.login( email=regular_user2.email )
+ self.visit_url( '%s/requests/new?create=True&select_request_type=%i' % (self.url, request_type.id) )
+ try:
+ self.check_page_for_string( 'There are no request types created for a new request.' )
+ raise AssertionError, 'The request_type %s is accessible by %s when it should be restricted' % ( request_type.name, regular_user2.email )
+ except:
+ pass
+ self.logout()
+ self.login( email=admin_user.email )
+
+ def test_030_create_address_and_library( self ):
"""Testing address & library creation"""
- # first create a regular user
- self.logout()
- self.login( email='test1(a)bx.psu.edu', username='regular-user1' )
- self.logout()
- self.login( email='test(a)bx.psu.edu' )
# first create a library for the request so that it can be submitted later
- lib_name = 'TestLib001'
- self.create_library( lib_name, '' )
- self.visit_page( 'library_admin/browse_libraries' )
- self.check_page_for_string( lib_name )
+ name = "TestLib001"
+ description = "TestLib001 description"
+ synopsis = "TestLib001 synopsis"
+ self.create_library( name=name, description=description, synopsis=synopsis )
# Get the library object for later tests
global library_one
- library_one = sa_session.query( galaxy.model.Library ) \
- .filter( and_( galaxy.model.Library.table.c.name==lib_name,
- galaxy.model.Library.table.c.deleted==False ) ) \
- .first()
- assert library_one is not None, 'Problem retrieving library named "%s" from the database' % lib_name
- global admin_user
- admin_user = sa_session.query( galaxy.model.User ) \
- .filter( galaxy.model.User.table.c.email=='test(a)bx.psu.edu' ) \
- .first()
- assert admin_user is not None, 'Problem retrieving user with email "test(a)bx.psu.edu" from the database'
- # Get the admin user's private role for later use
- global admin_user_private_role
- admin_user_private_role = None
- for role in admin_user.all_roles():
- if role.name == admin_user.email and role.description == 'Private Role for %s' % admin_user.email:
- admin_user_private_role = role
- break
- if not admin_user_private_role:
- raise AssertionError( "Private role not found for user '%s'" % admin_user.email )
- global regular_user1
- regular_user1 = sa_session.query( galaxy.model.User ) \
- .filter( galaxy.model.User.table.c.email=='test1(a)bx.psu.edu' ) \
- .first()
- assert regular_user1 is not None, 'Problem retrieving user with email "test1(a)bx.psu.edu" from the database'
- # Get the regular user's private role for later use
- global regular_user1_private_role
- regular_user1_private_role = None
- for role in regular_user1.all_roles():
- if role.name == regular_user1.email and role.description == 'Private Role for %s' % regular_user1.email:
- regular_user1_private_role = role
- break
- if not regular_user1_private_role:
- raise AssertionError( "Private role not found for user '%s'" % regular_user1.email )
- # Set permissions on the library, sort for later testing
+ library_one = get_library( name, description, synopsis )
+ assert library_one is not None, 'Problem retrieving library named "%s" from the database' % name
+ # Make sure library_one is public
+ assert 'access library' not in [ a.action for a in library_one.actions ], 'Library %s is not public when first created' % library_one.name
+ # Set permissions on the library, sort for later testing.
permissions_in = [ k for k, v in galaxy.model.Library.permitted_actions.items() ]
permissions_out = []
- name = 'Role for testing forms'
- description = "This is Role Ones description"
- user_ids=[ str( admin_user.id ), str( regular_user1.id ) ]
- self.create_role( name=name,
- description=description,
- in_user_ids=user_ids,
- in_group_ids=[],
- create_group_for_role='yes',
- private_role=admin_user.email )
- # Get the role object for later tests
- global role_one
- role_one = sa_session.query( galaxy.model.Role ).filter( galaxy.model.Role.table.c.name==name ).first()
- assert role_one is not None, 'Problem retrieving role named "Role for testing forms" from the database'
- # Role one members are: admin_user, regular_user1. Each of these users will be permitted to
- # LIBRARY_ADD, LIBRARY_MODIFY, LIBRARY_MANAGE for library items.
+ # Role one members are: admin_user, regular_user1, regular_user3. Each of these users will be permitted for
+ # LIBRARY_ACCESS, LIBRARY_ADD, LIBRARY_MODIFY, LIBRARY_MANAGE on this library and its contents.
self.library_permissions( self.security.encode_id( library_one.id ),
library_one.name,
str( role_one.id ),
permissions_in,
permissions_out )
- # create a folder in the library
+ # Make sure the library is accessible by admin_user
+ self.visit_url( '%s/library/browse_libraries' % self.url )
+ self.check_page_for_string( library_one.name )
+ # Make sure the library is not accessible by regular_user2 since regular_user2 does not have Role1.
+ self.logout()
+ self.login( email=regular_user2.email )
+ self.visit_url( '%s/library/browse_libraries' % self.url )
+ try:
+ self.check_page_for_string( library_one.name )
+ raise AssertionError, 'Library %s is accessible by %s when it should be restricted' % ( library_one.name, regular_user2.email )
+ except:
+ pass
+ self.logout()
+ self.login( email=admin_user.email )
+ # create folder
root_folder = library_one.root_folder
- name = "Folder One"
+ name = "Root Folder's Folder One"
+ description = "This is the root folder's Folder One"
self.add_folder( 'library_admin',
self.security.encode_id( library_one.id ),
self.security.encode_id( root_folder.id ),
name=name,
- description='' )
+ description=description )
global folder_one
- folder_one = sa_session.query( galaxy.model.LibraryFolder ) \
- .filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==root_folder.id,
- galaxy.model.LibraryFolder.table.c.name==name ) ) \
- .first()
+ folder_one = get_folder( root_folder.id, name, description )
assert folder_one is not None, 'Problem retrieving library folder named "%s" from the database' % name
- self.home()
- self.visit_url( '%s/library_common/browse_library?cntrller=library_admin&id=%s' % ( self.url, self.security.encode_id( library_one.id ) ) )
- self.check_page_for_string( name )
# create address
self.logout()
self.login( email='test1(a)bx.psu.edu', username='regular-user1' )
@@ -202,7 +260,7 @@
.filter( and_( galaxy.model.UserAddress.table.c.desc==address1[ 'short_desc' ],
galaxy.model.UserAddress.table.c.deleted==False ) ) \
.first()
- def test_030_create_request( self ):
+ def test_035_create_request( self ):
"""Testing creating, editing and submitting a request as a regular user"""
# login as a regular user
self.logout()
@@ -240,7 +298,7 @@
# check if the request's state is now set to 'submitted'
assert request_one.state is not request_one.states.SUBMITTED, "The state of the request '%s' should be set to '%s'" \
% ( request_one.name, request_one.states.SUBMITTED )
- def test_035_request_lifecycle( self ):
+ def test_040_request_lifecycle( self ):
"""Testing request lifecycle as it goes through all the states"""
# goto admin manage requests page
self.logout()
@@ -264,7 +322,7 @@
self.check_request_grid(state='Complete', request_name=request_one.name)
assert request_one.state is not request_one.states.COMPLETE, "The state of the request '%s' should be set to '%s'" \
% ( request_one.name, request_one.states.COMPLETE )
- def test_040_admin_create_request_on_behalf_of_regular_user( self ):
+ def test_045_admin_create_request_on_behalf_of_regular_user( self ):
"""Testing creating and submitting a request as an admin on behalf of a regular user"""
self.logout()
self.login( email='test(a)bx.psu.edu' )
@@ -301,7 +359,7 @@
# check if both the requests is showing in the 'All' filter
self.check_request_admin_grid(state='All', request_name=request_one.name)
self.check_request_admin_grid(state='All', request_name=request_two.name)
- def test_045_reject_request( self ):
+ def test_050_reject_request( self ):
'''Testing rejecting a request'''
self.logout()
self.login( email='test(a)bx.psu.edu' )
@@ -312,7 +370,7 @@
# check if the request's state is now set to 'submitted'
assert request_two.state is not request_two.states.REJECTED, "The state of the request '%s' should be set to '%s'" \
% ( request_two.name, request_two.states.REJECTED )
- def test_050_reset_data_for_later_test_runs( self ):
+ def test_055_reset_data_for_later_test_runs( self ):
"""Reseting data to enable later test runs to pass"""
# TODO: RC: add whatever is missing from this method that should be marked
# deleted or purged so that later test runs will correctly test features if the
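The request_type_permissions() helper added above drives the permissions form entirely through the query string: each permitted action name gets an _in or _out suffix whose value is a comma-separated list of role ids. A standalone sketch of that construction (parameter names mirror the helper; everything else is illustrative):

    def build_permissions_url( request_type_id, role_ids, permissions_in, permissions_out ):
        # Base URL as used by the test helper above.
        url = "requests_admin/manage_request_types?operation=permissions&id=%s&update_roles_button=Save" % request_type_id
        role_ids_str = ','.join( [ str( role_id ) for role_id in role_ids ] )
        # Actions being revoked get an "_out" key, actions being granted an "_in" key.
        for action in permissions_out:
            url = "%s&%s_out=%s" % ( url, action, role_ids_str )
        for action in permissions_in:
            url = "%s&%s_in=%s" % ( url, action, role_ids_str )
        return url

    # e.g. granting REQUEST_TYPE_ACCESS to role 5:
    print( build_permissions_url( 'abc123', [ 5 ], [ 'REQUEST_TYPE_ACCESS' ], [] ) )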
details: http://www.bx.psu.edu/hg/galaxy/rev/a77ec2944999
changeset: 3684:a77ec2944999
user: Nate Coraor <nate(a)bx.psu.edu>
date: Fri Apr 23 11:30:16 2010 -0400
description:
Community upload functionality
diffstat:
community_datatypes_conf.xml.sample | 9 +
community_wsgi.ini.sample | 2 +-
lib/galaxy/model/orm/__init__.py | 1 +
lib/galaxy/web/base/controller.py | 2 +-
lib/galaxy/webapps/community/app.py | 4 +
lib/galaxy/webapps/community/config.py | 1 +
lib/galaxy/webapps/community/controllers/tool_browser.py | 48 ++-
lib/galaxy/webapps/community/controllers/upload.py | 63 ++++
lib/galaxy/webapps/community/datatypes/__init__.py | 145 ++++++++++
lib/galaxy/webapps/community/model/__init__.py | 124 ++-----
lib/galaxy/webapps/community/model/mapping.py | 45 +-
lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py | 24 +-
templates/webapps/community/tool/edit_tool.mako | 73 +++++
templates/webapps/community/upload/upload.mako | 66 ++++
14 files changed, 480 insertions(+), 127 deletions(-)
diffs (832 lines):
diff -r 742fa2afcad9 -r a77ec2944999 community_datatypes_conf.xml.sample
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/community_datatypes_conf.xml.sample Fri Apr 23 11:30:16 2010 -0400
@@ -0,0 +1,9 @@
+<?xml version="1.0"?>
+<datatypes>
+ <registration>
+ <datatype extension="tool" type="galaxy.webapps.community.datatypes:Tool" model="galaxy.webapps.community.model:Tool"/>
+ </registration>
+ <sniffers>
+ <sniffer type="galaxy.webapps.community.datatypes:Tool"/>
+ </sniffers>
+</datatypes>
diff -r 742fa2afcad9 -r a77ec2944999 community_wsgi.ini.sample
--- a/community_wsgi.ini.sample Fri Apr 23 11:14:26 2010 -0400
+++ b/community_wsgi.ini.sample Fri Apr 23 11:30:16 2010 -0400
@@ -22,7 +22,7 @@
#database_connection = postgres:///community_test?host=/var/run/postgresql
# Where dataset files are saved
-file_path = database/files
+file_path = database/community_files
# Temporary storage for additional datasets, this should be shared through the cluster
new_file_path = database/tmp
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/model/orm/__init__.py
--- a/lib/galaxy/model/orm/__init__.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/model/orm/__init__.py Fri Apr 23 11:30:16 2010 -0400
@@ -3,5 +3,6 @@
from sqlalchemy import *
from sqlalchemy.orm import *
+import sqlalchemy.exc
from sqlalchemy.ext.orderinglist import ordering_list
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/web/base/controller.py Fri Apr 23 11:30:16 2010 -0400
@@ -304,7 +304,7 @@
class ControllerUnavailable( Exception ):
pass
-class Admin():
+class Admin( object ):
# Override these
user_list_grid = None
role_list_grid = None
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/app.py
--- a/lib/galaxy/webapps/community/app.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/app.py Fri Apr 23 11:30:16 2010 -0400
@@ -1,5 +1,6 @@
import sys, config
import galaxy.webapps.community.model
+import galaxy.webapps.community.datatypes
from galaxy.web import security
from galaxy.tags.tag_handler import CommunityTagHandler
@@ -11,6 +12,9 @@
self.config = config.Configuration( **kwargs )
self.config.check()
config.configure_logging( self.config )
+ # Set up datatypes registry
+ self.datatypes_registry = galaxy.webapps.community.datatypes.Registry( self.config.root, self.config.datatypes_config )
+ galaxy.model.set_datatypes_registry( self.datatypes_registry )
# Determine the database url
if self.config.database_connection:
db_url = self.config.database_connection
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/config.py
--- a/lib/galaxy/webapps/community/config.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/config.py Fri Apr 23 11:30:16 2010 -0400
@@ -61,6 +61,7 @@
self.screencasts_url = kwargs.get( 'screencasts_url', None )
self.log_events = False
self.cloud_controller_instance = False
+ self.datatypes_config = kwargs.get( 'datatypes_config_file', 'community_datatypes_conf.xml' )
# Parse global_conf and save the parser
global_conf = kwargs.get( 'global_conf', None )
global_conf_parser = ConfigParser.ConfigParser()
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/controllers/tool_browser.py
--- a/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/controllers/tool_browser.py Fri Apr 23 11:30:16 2010 -0400
@@ -1,5 +1,4 @@
import sys, os, operator, string, shutil, re, socket, urllib, time, logging
-from cgi import escape, FieldStorage
from galaxy.web.base.controller import *
from galaxy.webapps.community import model
@@ -27,18 +26,14 @@
title = "Tools"
model_class = model.Tool
template='/webapps/community/tool/grid.mako'
- default_sort_key = "category"
+ default_sort_key = "name"
columns = [
NameColumn( "Name",
key="name",
model_class=model.Tool,
+ link=( lambda item: dict( operation="Edit Tool", id=item.id, webapp="community" ) ),
attach_popup=False,
filterable="advanced" ),
- CategoryColumn( "Category",
- key="category",
- model_class=model.Tool,
- attach_popup=False,
- filterable="advanced" ),
# Columns that are valid for filtering but are not visible.
grids.DeletedColumn( "Deleted", key="deleted", visible=False, filterable="advanced" )
]
@@ -48,7 +43,7 @@
visible=False,
filterable="standard" ) )
global_actions = [
- grids.GridAction( "Upload tool", dict( controller='tool_browwser', action='upload' ) )
+ grids.GridAction( "Upload tool", dict( controller='upload', action='upload', type='tool' ) )
]
operations = [
grids.GridOperation( "View versions", condition=( lambda item: not item.deleted ), allow_multiple=False )
@@ -57,7 +52,7 @@
grids.GridColumnFilter( "Deleted", args=dict( deleted=True ) ),
grids.GridColumnFilter( "All", args=dict( deleted='All' ) )
]
- default_filter = dict( name="All", category="All", deleted="False" )
+ default_filter = dict( name="All", deleted="False" )
num_rows_per_page = 50
preserve_state = False
use_paging = True
@@ -84,6 +79,10 @@
return trans.response.send_redirect( web.url_for( controller='tool_browser',
action='browse_tool',
**kwargs ) )
+ elif operation == "edit tool":
+ return trans.response.send_redirect( web.url_for( controller='tool_browser',
+ action='edit_tool',
+ **kwargs ) )
# Render the list view
return self.tool_list_grid( trans, **kwargs )
@web.expose
@@ -96,5 +95,32 @@
message=message,
status=status )
@web.expose
- def upload( self, trans, **kwargs ):
- pass
+ def edit_tool( self, trans, id=None, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ # Get the tool
+ tool = None
+ if id is not None:
+ encoded_id = id
+ id = trans.app.security.decode_id( id )
+ tool = trans.sa_session.query( trans.model.Tool ).get( id )
+ if tool is None:
+ return trans.response.send_redirect( web.url_for( controller='tool_browser',
+ action='browse_tools',
+ message='Please select a Tool to edit (the tool ID provided was invalid)',
+ status='error' ) )
+ if params.save_button and ( params.file_data != '' or params.url != '' ):
+ # TODO: call the upload method in the upload controller.
+ message = 'Uploading new version not implemented'
+ status = 'error'
+ elif params.save_button:
+ tool.user_description = params.description
+ tool.category = params.category
+ categories = trans.sa_session.query( trans.model.Category ).order_by( trans.model.Category.table.c.name ).all()
+ return trans.fill_template( '/webapps/community/tool/edit_tool.mako',
+ encoded_id = encoded_id,
+ tool=tool,
+ categories=categories,
+ message=message,
+ status=status )
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/controllers/upload.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/webapps/community/controllers/upload.py Fri Apr 23 11:30:16 2010 -0400
@@ -0,0 +1,63 @@
+import sys, os, shutil, logging, urllib2
+
+from galaxy.web.base.controller import *
+from galaxy.web.framework.helpers import time_ago, iff, grids
+from galaxy.model.orm import *
+from galaxy.webapps.community import datatypes
+
+log = logging.getLogger( __name__ )
+
+# States for passing messages
+SUCCESS, INFO, WARNING, ERROR = "done", "info", "warning", "error"
+
+class UploadController( BaseController ):
+
+ @web.expose
+ def upload( self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ uploaded_file = None
+ if params.file_data == '' and params.url.strip() == '':
+ message = 'No files were entered on the upload form.'
+ status = 'error'
+ elif params.file_data == '':
+ try:
+ uploaded_file = urllib2.urlopen( params.url.strip() )
+ except ( ValueError, urllib2.HTTPError ), e:
+ message = 'An error occurred trying to retrieve the URL entered on the upload form: %s' % e
+ status = 'error'
+ except urllib2.URLError, e:
+ message = 'An error occurred trying to retrieve the URL entered on the upload form: %s' % e.reason
+ status = 'error'
+ elif params.file_data not in ( '', None ):
+ uploaded_file = params.file_data.file
+ if params.upload_button and uploaded_file:
+ datatype = trans.app.datatypes_registry.get_datatype_by_extension( params.upload_type )
+ if datatype is None:
+ message = 'An unknown filetype was selected. This should not be possible, please report the error.'
+ status = 'error'
+ else:
+ try:
+ meta = datatype.verify( uploaded_file )
+ meta.user = trans.user
+ obj = datatype.create_model_object( meta )
+ trans.sa_session.add( obj )
+ trans.sa_session.flush()
+ try:
+ os.link( uploaded_file.name, obj.file_name )
+ except OSError:
+ shutil.copy( uploaded_file.name, obj.file_name )
+ message = 'Uploaded %s' % meta.message
+ except datatypes.DatatypeVerificationError, e:
+ message = str( e )
+ status = 'error'
+ except sqlalchemy.exc.IntegrityError:
+ message = 'A tool with the same ID already exists. If you are trying to update this tool to a new version, please ... ??? ... Otherwise, please choose a new ID.'
+ status = 'error'
+ uploaded_file.close()
+ selected_upload_type = params.get( 'type', 'tool' )
+ return trans.fill_template( '/webapps/community/upload/upload.mako', message=message,
+ status=status,
+ selected_upload_type=selected_upload_type,
+ upload_types=trans.app.datatypes_registry.get_datatypes_for_select_list() )
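One detail worth noting in the upload controller above: the uploaded file is attached to the model object with os.link(), falling back to shutil.copy() when hard-linking fails (typically because source and destination are on different filesystems). A hedged, standalone version of that pattern:

    import os, shutil

    def link_or_copy( src, dst ):
        # Hard links are cheap (no data copied) but only valid within a
        # single filesystem, so fall back to a real copy on OSError.
        try:
            os.link( src, dst )
        except OSError:
            shutil.copy( src, dst )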
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/datatypes/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/webapps/community/datatypes/__init__.py Fri Apr 23 11:30:16 2010 -0400
@@ -0,0 +1,145 @@
+import sys, logging, tarfile
+import galaxy.util
+from galaxy.util.bunch import Bunch
+
+log = logging.getLogger( __name__ )
+
+if sys.version_info[:2] == ( 2, 4 ):
+ from galaxy import eggs
+ eggs.require( 'ElementTree' )
+ from elementtree import ElementTree
+else:
+ from xml.etree import ElementTree
+
+class DatatypeVerificationError( Exception ):
+ pass
+
+class Registry( object ):
+ def __init__( self, root_dir=None, config=None ):
+ self.datatypes_by_extension = {}
+ self.sniff_order = []
+ if root_dir and config:
+ # Parse datatypes_conf.xml
+ tree = galaxy.util.parse_xml( config )
+ root = tree.getroot()
+ # Load datatypes and converters from config
+ log.debug( 'Loading datatypes from %s' % config )
+ registration = root.find( 'registration' )
+ for elem in registration.findall( 'datatype' ):
+ try:
+ extension = elem.get( 'extension', None )
+ dtype = elem.get( 'type', None )
+ model_object = elem.get( 'model', None )
+ if extension and dtype:
+ fields = dtype.split( ':' )
+ datatype_module = fields[0]
+ datatype_class = fields[1]
+ fields = datatype_module.split( '.' )
+ module = __import__( fields.pop(0) )
+ for mod in fields:
+ module = getattr( module, mod )
+ self.datatypes_by_extension[extension] = getattr( module, datatype_class )()
+ log.debug( 'Loaded datatype: %s' % dtype )
+ if model_object:
+ model_module, model_class = model_object.split( ':' )
+ fields = model_module.split( '.' )
+ module = __import__( fields.pop(0) )
+ for mod in fields:
+ module = getattr( module, mod )
+ self.datatypes_by_extension[extension].model_object = getattr( module, model_class )
+ log.debug( 'Added model class: %s to datatype: %s' % ( model_class, dtype ) )
+ except Exception, e:
+ log.warning( 'Error loading datatype "%s", problem: %s' % ( extension, str( e ) ) )
+ # Load datatype sniffers from the config
+ sniff_order = []
+ sniffers = root.find( 'sniffers' )
+ for elem in sniffers.findall( 'sniffer' ):
+ dtype = elem.get( 'type', None )
+ if dtype:
+ sniff_order.append( dtype )
+ for dtype in sniff_order:
+ try:
+ fields = dtype.split( ":" )
+ datatype_module = fields[0]
+ datatype_class = fields[1]
+ fields = datatype_module.split( "." )
+ module = __import__( fields.pop(0) )
+ for mod in fields:
+ module = getattr( module, mod )
+ aclass = getattr( module, datatype_class )()
+ included = False
+ for atype in self.sniff_order:
+ if not issubclass( atype.__class__, aclass.__class__ ) and isinstance( atype, aclass.__class__ ):
+ included = True
+ break
+ if not included:
+ self.sniff_order.append( aclass )
+ log.debug( 'Loaded sniffer for datatype: %s' % dtype )
+ except Exception, exc:
+ log.warning( 'Error appending datatype %s to sniff_order, problem: %s' % ( dtype, str( exc ) ) )
+ def get_datatype_by_extension( self, ext ):
+ return self.datatypes_by_extension.get( ext, None )
+ def get_datatypes_for_select_list( self ):
+ rval = []
+ for ext, datatype in self.datatypes_by_extension.items():
+ rval.append( ( ext, datatype.select_name ) )
+ return rval
+ def sniff( self, fname ):
+ for datatype in self.sniff_order:
+ try:
+ datatype.sniff( fname )
+ return datatype.file_ext
+ except:
+ pass
+
+class Tool( object ):
+ select_name = 'Tool'
+ def __init__( self, model_object=None ):
+ self.model_object = model_object
+ def verify( self, file ):
+ msg = ''
+ try:
+ tar = tarfile.TarFile( fileobj = file )
+ except tarfile.ReadError:
+ raise DatatypeVerificationError( 'The tool file is not a readable tar file' )
+ xml_names = filter( lambda x: x.lower().endswith( '.xml' ), tar.getnames() )
+ if not xml_names:
+ raise DatatypeVerificationError( 'The tool file does not contain an XML file' )
+ for xml_name in xml_names:
+ try:
+ tree = ElementTree.parse( tar.extractfile( xml_name ) )
+ root = tree.getroot()
+ except:
+ log.exception( 'fail:' )
+ continue
+ if root.tag == 'tool':
+ rval = Bunch()
+ try:
+ rval.id = root.attrib['id']
+ rval.name = root.attrib['name']
+ rval.version = root.attrib['version']
+ except KeyError, e:
+ raise DatatypeVerificationError( 'Tool XML file does not conform to the specification. Missing required <tool> tag attribute: %s' % e )
+ rval.description = None
+ desc_tag = root.find( 'description' )
+ if desc_tag is not None:
+ rval.description = desc_tag.text.strip()
+ rval.message = 'Tool: %s %s, Version: %s, ID: %s' % ( rval.name, rval.description or '', rval.version, rval.id )
+ return rval
+ else:
+ raise DatatypeVerificationError( 'Unable to find a properly formatted tool XML file' )
+ def create_model_object( self, datatype_bunch ):
+ if self.model_object is None:
+ raise Exception( 'No model object configured for %s, please check the datatype configuration file' % self.__class__.__name__ )
+ if datatype_bunch is None:
+ # TODO: do it automatically
+ raise Exception( 'Unable to create %s model object without passing in data' % self.__class__.__name__ )
+ o = self.model_object()
+ o.create_from_datatype( datatype_bunch )
+ return o
+ def sniff( self, fname ):
+ try:
+ self.verify( open( fname, 'r' ) )
+ return True
+ except:
+ return False
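The Registry above resolves the module:Class strings from community_datatypes_conf.xml with __import__ followed by a getattr walk, once for the datatype class and once for its model class. A compact sketch of just that resolution step (standalone; the example spec matches the sample config earlier in this changeset):

    def load_class( spec ):
        # spec looks like 'package.module:ClassName'
        module_path, class_name = spec.split( ':' )
        parts = module_path.split( '.' )
        # __import__( 'a.b.c' ) returns the top-level package 'a', so walk
        # down to the leaf module with getattr, as the Registry does.
        module = __import__( parts[0] )
        for part in parts[1:]:
            module = getattr( module, part )
        return getattr( module, class_name )

    # e.g. load_class( 'galaxy.webapps.community.datatypes:Tool' )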
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/model/__init__.py
--- a/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/model/__init__.py Fri Apr 23 11:30:16 2010 -0400
@@ -4,7 +4,7 @@
Naming: try to use class names that have a distinct plural form so that
the relationship cardinalities are obvious (e.g. prefer Dataset to Data)
"""
-import os.path, os, errno, sys, codecs, operator, tempfile, logging
+import os.path, os, errno, sys, codecs, operator, tempfile, logging, tarfile
from galaxy.util.bunch import Bunch
from galaxy import util
from galaxy.util.hash_util import *
@@ -86,93 +86,43 @@
self.prev_session_id = prev_session_id
class Tool( object ):
- def __init__( self, guid=None, name=None, description=None, category=None, version=None, user_id=None, external_filename=None ):
+ file_path = '/tmp'
+ def __init__( self, guid=None, tool_id=None, name=None, description=None, user_description=None, category=None, version=None, user_id=None, external_filename=None ):
self.guid = guid
+ self.tool_id = tool_id
self.name = name or "Unnamed tool"
self.description = description
+ self.user_description = user_description
self.category = category
self.version = version or "1.0.0"
self.user_id = user_id
self.external_filename = external_filename
+ def get_file_name( self ):
+ if not self.external_filename:
+ assert self.id is not None, "ID must be set before filename used (commit the object)"
+ dir = os.path.join( self.file_path, 'tools', *directory_hash_id( self.id ) )
+ # Create directory if it does not exist
+ if not os.path.exists( dir ):
+ os.makedirs( dir )
+ # Return filename inside hashed directory
+ filename = os.path.join( dir, "tool_%d.dat" % self.id )
+ else:
+ filename = self.external_filename
+ # Make filename absolute
+ return os.path.abspath( filename )
+ def set_file_name( self, filename ):
+ if not filename:
+ self.external_filename = None
+ else:
+ self.external_filename = filename
+ file_name = property( get_file_name, set_file_name )
+ def create_from_datatype( self, datatype_bunch ):
+ self.tool_id = datatype_bunch.id
+ self.name = datatype_bunch.name
+ self.version = datatype_bunch.version
+ self.description = datatype_bunch.description
+ self.user_id = datatype_bunch.user
-class Job( object ):
- """
- A job represents a request to run a tool given input datasets, tool
- parameters, and output datasets.
- """
- states = Bunch( NEW = 'new',
- UPLOAD = 'upload',
- WAITING = 'waiting',
- QUEUED = 'queued',
- RUNNING = 'running',
- OK = 'ok',
- ERROR = 'error',
- DELETED = 'deleted' )
- def __init__( self ):
- self.session_id = None
- self.tool_id = None
- self.tool_version = None
- self.command_line = None
- self.param_filename = None
- self.parameters = []
- self.input_datasets = []
- self.output_datasets = []
- self.output_library_datasets = []
- self.state = Job.states.NEW
- self.info = None
- self.job_runner_name = None
- self.job_runner_external_id = None
- def add_parameter( self, name, value ):
- self.parameters.append( JobParameter( name, value ) )
- def add_input_dataset( self, name, dataset ):
- self.input_datasets.append( JobToInputDatasetAssociation( name, dataset ) )
- def add_output_dataset( self, name, dataset ):
- self.output_datasets.append( JobToOutputDatasetAssociation( name, dataset ) )
- def add_output_library_dataset( self, name, dataset ):
- self.output_library_datasets.append( JobToOutputLibraryDatasetAssociation( name, dataset ) )
- def set_state( self, state ):
- self.state = state
- # For historical reasons state propogates down to datasets
- for da in self.output_datasets:
- da.dataset.state = state
- def get_param_values( self, app ):
- """
- Read encoded parameter values from the database and turn back into a
- dict of tool parameter values.
- """
- param_dict = dict( [ ( p.name, p.value ) for p in self.parameters ] )
- tool = app.toolbox.tools_by_id[self.tool_id]
- param_dict = tool.params_from_strings( param_dict, app )
- return param_dict
- def check_if_output_datasets_deleted( self ):
- """
- Return true if all of the output datasets associated with this job are
- in the deleted state
- """
- for dataset_assoc in self.output_datasets:
- dataset = dataset_assoc.dataset
- # only the originator of the job can delete a dataset to cause
- # cancellation of the job, no need to loop through history_associations
- if not dataset.deleted:
- return False
- return True
- def mark_deleted( self ):
- """
- Mark this job as deleted, and mark any output datasets as discarded.
- """
- self.state = Job.states.DELETED
- self.info = "Job output deleted by user before job completed."
- for dataset_assoc in self.output_datasets:
- dataset = dataset_assoc.dataset
- dataset.deleted = True
- dataset.state = dataset.states.DISCARDED
- for dataset in dataset.dataset.history_associations:
- # propagate info across shared datasets
- dataset.deleted = True
- dataset.blurb = 'deleted'
- dataset.peek = 'Job deleted'
- dataset.info = 'Job output deleted by user before job completed'
-
class Tag ( object ):
def __init__( self, id=None, type=None, parent_id=None, name=None ):
self.id = id
@@ -182,6 +132,12 @@
def __str__ ( self ):
return "Tag(id=%s, type=%i, parent_id=%s, name=%s)" % ( self.id, self.type, self.parent_id, self.name )
+class Category( object ):
+ def __init__( self, id=None, name=None, description=None ):
+ self.id = id
+ self.name = name
+ self.description = description
+
class ItemTagAssociation ( object ):
def __init__( self, id=None, user=None, item_id=None, tag_id=None, user_tname=None, value=None ):
self.id = id
@@ -198,6 +154,12 @@
class ToolAnnotationAssociation( object ):
pass
+class ToolCategoryAssociation( object ):
+ def __init__( self, id=None, tool=None, category=None ):
+ self.id = id
+ self.tool = tool
+ self.category = category
+
## ---- Utility methods -------------------------------------------------------
def directory_hash_id( id ):
@@ -207,7 +169,7 @@
if l < 4:
return [ "000" ]
# Pad with zeros until a multiple of three
- padded = ( ( 3 - len( s ) % 3 ) * "0" ) + s
+ padded = ( ( ( 3 - len( s ) ) % 3 ) * "0" ) + s
# Drop the last three digits -- 1000 files per directory
padded = padded[:-3]
# Break into chunks of three
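The one-character change to directory_hash_id() above fixes the padding arithmetic: the old 3 - len( s ) % 3 pads three zeros when the length is already a multiple of three, while the new ( 3 - len( s ) ) % 3 correctly pads none. A worked sketch of the whole scheme (reconstructed from the visible context, so treat it as illustrative):

    def directory_hash_id( id ):
        s = str( id )
        if len( s ) < 4:
            return [ "000" ]
        # Pad with zeros until the length is a multiple of three.
        padded = ( ( ( 3 - len( s ) ) % 3 ) * "0" ) + s
        # Drop the last three digits -- 1000 files per directory.
        padded = padded[:-3]
        # Break into chunks of three.
        return [ padded[i:i+3] for i in range( 0, len( padded ), 3 ) ]

    print( directory_hash_id( 12345 ) )   # ['012'] -> .../tools/012/tool_12345.dat
    print( directory_hash_id( 123456 ) )  # ['123'] (the old formula gave ['000', '123'])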
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/model/mapping.py Fri Apr 23 11:30:16 2010 -0400
@@ -103,32 +103,28 @@
Tool.table = Table( "tool", metadata,
Column( "id", Integer, primary_key=True ),
Column( "guid", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "tool_id", TrimmedString( 255 ), index=True, unique=True ),
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
- Column( "name", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "name", TrimmedString( 255 ), index=True ),
Column( "description" , TEXT ),
- Column( "category", TrimmedString( 255 ), index=True ),
+ Column( "user_description" , TEXT ),
Column( "version", TrimmedString( 255 ) ),
Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True ),
Column( "external_filename" , TEXT ),
Column( "deleted", Boolean, default=False ) )
-Job.table = Table( "job", metadata,
+Category.table = Table( "category", metadata,
Column( "id", Integer, primary_key=True ),
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
+ Column( "name", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "description" , TEXT ) )
+
+ToolCategoryAssociation.table = Table( "tool_category_association", metadata,
+ Column( "id", Integer, primary_key=True ),
Column( "tool_id", Integer, ForeignKey( "tool.id" ), index=True ),
- Column( "state", String( 64 ), index=True ),
- Column( "info", TrimmedString( 255 ) ),
- Column( "command_line", TEXT ),
- Column( "param_filename", String( 1024 ) ),
- Column( "runner_name", String( 255 ) ),
- Column( "stdout", TEXT ),
- Column( "stderr", TEXT ),
- Column( "traceback", TEXT ),
- Column( "session_id", Integer, ForeignKey( "galaxy_session.id" ), index=True, nullable=True ),
- Column( "job_runner_name", String( 255 ) ),
- Column( "job_runner_external_id", String( 255 ) ) )
+ Column( "category_id", Integer, ForeignKey( "category.id" ), index=True ) )
Tag.table = Table( "tag", metadata,
Column( "id", Integer, primary_key=True ),
@@ -193,10 +189,6 @@
assign_mapper( context, GalaxySession, GalaxySession.table,
properties=dict( user=relation( User.mapper ) ) )
-assign_mapper( context, Job, Job.table,
- properties=dict( galaxy_session=relation( GalaxySession ),
- tool=relation( Tool ) ) )
-
assign_mapper( context, Tag, Tag.table,
properties=dict( children=relation(Tag, backref=backref( 'parent', remote_side=[Tag.table.c.id] ) ) ) )
@@ -207,7 +199,22 @@
properties=dict( tool=relation( Tool ), user=relation( User ) ) )
assign_mapper( context, Tool, Tool.table,
- properties = dict( user=relation( User.mapper ) ) )
+ properties = dict(
+ categories=relation( ToolCategoryAssociation ),
+ user=relation( User.mapper )
+ )
+)
+
+assign_mapper( context, Category, Category.table,
+ properties=dict( tools=relation( ToolCategoryAssociation ) ) )
+
+assign_mapper( context, ToolCategoryAssociation, ToolCategoryAssociation.table,
+ properties=dict(
+ category=relation(Category),
+ tool=relation(Tool)
+ )
+)
+
def guess_dialect_for_url( url ):
return (url.split(':', 1))[0]
diff -r 742fa2afcad9 -r a77ec2944999 lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py
--- a/lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py Fri Apr 23 11:14:26 2010 -0400
+++ b/lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py Fri Apr 23 11:30:16 2010 -0400
@@ -80,32 +80,28 @@
Tool_table = Table( "tool", metadata,
Column( "id", Integer, primary_key=True ),
Column( "guid", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "tool_id", TrimmedString( 255 ), index=True, unique=True ),
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
- Column( "name", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "name", TrimmedString( 255 ), index=True ),
Column( "description" , TEXT ),
- Column( "category", TrimmedString( 255 ), index=True ),
+ Column( "user_description" , TEXT ),
Column( "version", TrimmedString( 255 ) ),
Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True ),
Column( "external_filename" , TEXT ),
Column( "deleted", Boolean, default=False ) )
-Job_table = Table( "job", metadata,
+Category_table = Table( "category", metadata,
Column( "id", Integer, primary_key=True ),
Column( "create_time", DateTime, default=now ),
Column( "update_time", DateTime, default=now, onupdate=now ),
+ Column( "name", TrimmedString( 255 ), index=True, unique=True ),
+ Column( "description" , TEXT ) )
+
+ToolCategoryAssociation_table = Table( "tool_category_association", metadata,
+ Column( "id", Integer, primary_key=True ),
Column( "tool_id", Integer, ForeignKey( "tool.id" ), index=True ),
- Column( "state", String( 64 ), index=True ),
- Column( "info", TrimmedString( 255 ) ),
- Column( "command_line", TEXT ),
- Column( "param_filename", String( 1024 ) ),
- Column( "runner_name", String( 255 ) ),
- Column( "stdout", TEXT ),
- Column( "stderr", TEXT ),
- Column( "traceback", TEXT ),
- Column( "session_id", Integer, ForeignKey( "galaxy_session.id" ), index=True, nullable=True ),
- Column( "job_runner_name", String( 255 ) ),
- Column( "job_runner_external_id", String( 255 ) ) )
+ Column( "category_id", Integer, ForeignKey( "category.id" ), index=True ) )
Tag_table = Table( "tag", metadata,
Column( "id", Integer, primary_key=True ),
diff -r 742fa2afcad9 -r a77ec2944999 templates/webapps/community/tool/edit_tool.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/webapps/community/tool/edit_tool.mako Fri Apr 23 11:30:16 2010 -0400
@@ -0,0 +1,73 @@
+<%namespace file="/message.mako" import="render_msg" />
+
+<%!
+ def inherit(context):
+ if context.get('use_panels'):
+ return '/webapps/community/base_panels.mako'
+ else:
+ return '/base.mako'
+%>
+<%inherit file="${inherit(context)}"/>
+
+<%def name="title()">Edit Tool</%def>
+
+<h2>Edit Tool: ${tool.name} ${tool.version} (${tool.tool_id})</h2>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<form id="tool_edit_form" name="tool_edit_form" action="${h.url_for( controller='tool_browser', action='edit_tool' )}" enctype="multipart/form-data" method="post">
+<input type="hidden" name="id" value="${encoded_id}"/>
+<div class="toolForm">
+ <div class="toolFormTitle">Edit Tool</div>
+ <div class="toolFormBody">
+ <div class="form-row">
+ <label>Categories:</label>
+ <div class="form-row-input">
+ <select name="category" multiple size=5>
+ %for category in categories:
+ %if category.id in [ tool_category.id for tool_category in tool.categories ]:
+ <option value="${category.id}" selected>${category.name}</option>
+ %else:
+ <option value="${category.id}">${category.name}</option>
+ %endif
+ %endfor
+ </select>
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Description:</label>
+ <div class="form-row-input"><textarea name="description" rows="5" cols="35"></textarea></div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <input type="submit" class="primary-button" name="save_button" value="Save">
+ </div>
+ </div>
+</div>
+
+<p/>
+
+<div class="toolForm">
+ <div class="toolFormTitle">Upload new version</div>
+ <div class="toolFormBody">
+ <div class="form-row">
+ <label>File:</label>
+ <div class="form-row-input"><input type="file" name="file_data"/></div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>URL:</label>
+ <div class="form-row-input"><input type="text" name="url" style="width: 100%;"/></div>
+ <div class="toolParamHelp" style="clear: both;">
+ Instead of uploading directly from your computer, you may instruct Galaxy to download the file from a Web or FTP address.
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <input type="submit" class="primary-button" name="save_button" value="Save">
+ </div>
+</div>
+</form>
diff -r 742fa2afcad9 -r a77ec2944999 templates/webapps/community/upload/upload.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/webapps/community/upload/upload.mako Fri Apr 23 11:30:16 2010 -0400
@@ -0,0 +1,66 @@
+<%namespace file="/message.mako" import="render_msg" />
+
+<%!
+ def inherit(context):
+ if context.get('use_panels'):
+ return '/webapps/community/base_panels.mako'
+ else:
+ return '/base.mako'
+%>
+<%inherit file="${inherit(context)}"/>
+
+<%def name="title()">Upload</%def>
+
+<h2>Upload</h2>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<div class="toolForm">
+ <div class="toolFormTitle">Upload</div>
+ <div class="toolFormBody">
+ ## TODO: nginx
+ <form id="upload_form" name="upload_form" action="${h.url_for( controller='upload', action='upload' )}" enctype="multipart/form-data" method="post">
+ <div class="form-row">
+ <label>Upload Type</label>
+ <div class="form-row-input">
+ <select name="upload_type">
+ %for type_id, type_name in upload_types:
+ %if type_id == selected_upload_type:
+ <option value="${type_id}" selected>${type_name}</option>
+ %else:
+ <option value="${type_id}">${type_name}</option>
+ %endif
+ %endfor
+ </select>
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ Need help creating a tool file? See help below.
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>File:</label>
+ <div class="form-row-input"><input type="file" name="file_data"/></div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>URL:</label>
+ <div class="form-row-input"><input type="text" name="url" style="width: 100%;"/></div>
+ <div class="toolParamHelp" style="clear: both;">
+ Instead of uploading directly from your computer, you may instruct Galaxy to download the file from a Web or FTP address.
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <input type="submit" class="primary-button" name="upload_button" value="Upload">
+ </div>
+ </form>
+ </div>
+</div>
+<div class="toolHelp">
+ <div class="toolHelpBody">
+ <p><strong>Tool Files</strong></p>
+ </div>
+</div>
10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/742fa2afcad9
changeset: 3683:742fa2afcad9
user: Dan Blankenberg <dan(a)bx.psu.edu>
date: Fri Apr 23 11:14:26 2010 -0400
description:
Updates for 'Profile Annotations for a set of genomic intervals' tool. This tool will now report a 'data version'. Add a script that creates the indexes and table description xml from a UCSC database dump.
diffstat:
lib/galaxy/tools/parameters/basic.py | 12 +-
scripts/tools/annotation_profiler/README.txt | 54 +
scripts/tools/annotation_profiler/build_profile_indexes.py | 338 ++++++++++
tool-data/annotation_profiler_options.xml.sample | 2 +-
tools/annotation_profiler/annotation_profiler.xml | 4 +-
tools/annotation_profiler/annotation_profiler_for_interval.py | 74 +-
6 files changed, 449 insertions(+), 35 deletions(-)
diffs (603 lines):
diff -r c37de7a983e7 -r 742fa2afcad9 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Thu Apr 22 21:11:17 2010 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Fri Apr 23 11:14:26 2010 -0400
@@ -960,8 +960,11 @@
if filter.get( 'type' ) == 'data_meta':
if filter.get( 'data_ref' ) not in self.filtered:
self.filtered[filter.get( 'data_ref' )] = {}
- self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )] = { 'value': filter.get( 'value' ), 'options':[] }
- recurse_option_elems( self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )]['options'], filter.find( 'options' ).findall( 'option' ) )
+ if filter.get( 'meta_key' ) not in self.filtered[filter.get( 'data_ref' )]:
+ self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )] = {}
+ if filter.get( 'value' ) not in self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )]:
+ self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )][filter.get( 'value' )] = []
+ recurse_option_elems( self.filtered[filter.get( 'data_ref' )][filter.get( 'meta_key' )][filter.get( 'value' )], filter.find( 'options' ).findall( 'option' ) )
else:
recurse_option_elems( self.options, elem.find( 'options' ).findall( 'option' ) )
@@ -974,8 +977,9 @@
dataset = dataset.dataset
if dataset:
for meta_key, meta_dict in filter_value.iteritems():
- if dataset.metadata.spec[meta_key].param.to_string( dataset.metadata.get( meta_key ) ) == meta_dict['value']:
- options.extend( meta_dict['options'] )
+ check_meta_val = dataset.metadata.spec[meta_key].param.to_string( dataset.metadata.get( meta_key ) )
+ if check_meta_val in meta_dict:
+ options.extend( meta_dict[check_meta_val] )
return options
return self.options
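The refactoring above turns self.filtered into a nested dict keyed data_ref -> meta_key -> metadata value -> option list, so a single metadata key can now select between several option sets instead of matching one hard-coded value. A toy illustration of the lookup (standalone, hypothetical values):

    # filtered[ data_ref ][ meta_key ][ meta_value ] -> list of options
    filtered = { 'input1': { 'dbkey': { 'hg18': [ 'option set for hg18' ],
                                        'hg19': [ 'option set for hg19' ] } } }

    def options_for( data_ref, meta_key, meta_value ):
        options = []
        meta_dict = filtered.get( data_ref, {} ).get( meta_key, {} )
        if meta_value in meta_dict:
            options.extend( meta_dict[ meta_value ] )
        return options

    print( options_for( 'input1', 'dbkey', 'hg19' ) )  # ['option set for hg19']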
diff -r c37de7a983e7 -r 742fa2afcad9 scripts/tools/annotation_profiler/README.txt
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/scripts/tools/annotation_profiler/README.txt Fri Apr 23 11:14:26 2010 -0400
@@ -0,0 +1,54 @@
+This file explains how to create annotation indexes for the annotation profiler tool. Annotation profiler indexes are an exceedingly simple binary format,
+containing no header information and consisting of an ordered linear list of (start,stop encoded individually as '<I') regions covered by a UCSC table, partitioned
+into one file per chromosome name. Genomic regions are merged by overlap / direct adjacency (e.g. a table having ranges of: 1-10, 6-12, 12-20 and 25-28 results in two merged ranges of: 1-20 and 25-28).
+
+Files are arranged like:
+/profiled_annotations/DBKEY/TABLE_NAME/
+ CHROMOSOME_NAME.covered
+ CHROMOSOME_NAME.total_coverage
+ CHROMOSOME_NAME.total_regions
+/profiled_annotations/DBKEY/
+ DBKEY_tables.xml
+ chromosomes.txt
+ profiled_info.txt
+
+
+where CHROMOSOME_NAME.covered is the binary file, CHROMOSOME_NAME.total_coverage is a text file containing the integer count of bases covered by the
+table and CHROMOSOME_NAME.total_regions contains the integer count of the number of regions found in CHROMOSOME_NAME.covered
+
+DBKEY_tables.xml should be appended to the annotation profile available table configuration file (tool-data/annotation_profiler_options.xml).
+The DBKEY should also be added as a new line to the annotation profiler valid builds file (annotation_profiler_valid_builds.txt).
+The output (/profiled_annotations/DBKEY) should be made available as GALAXY_ROOT/tool-data/annotation_profiler/DBKEY.
+
+profiled_info.txt contains info on the generated annotations, separated by lines with tab-delimited label,value pairs:
+ profiler_version - the version of the build_profile_indexes.py script that was used to generate the profiled data
+ dbkey - the dbkey used for the run
+ chromosomes - contains the names and lengths of chromosomes that were used to parse single-chromosome tables (tables divided into individual files by chromosome)
+ dump_time - the declared dump time of the database, taken from trackDb.txt.gz
+ profiled_time - seconds since epoch in utc for when the database dump was profiled
+ database_hash - a md5 hex digest of all the profiled table info
+
+
+Typical usage includes:
+
+python build_profile_indexes.py -d hg19 -i /ucsc_data/hg19/database/ > hg19.txt
+
+where the genome build is hg19 and /ucsc_data/hg19/database/ contains the downloaded database dump from UCSC (e.g. obtained by rsync: rsync -avzP rsync://hgdownload.cse.ucsc.edu/goldenPath/hg19/database/ /ucsc_data/hg19/database/).
+
+
+
+By default, chromosome names come from a file named 'chromInfo.txt.gz' found in the input directory, with FTP used as a backup.
+When FTP is used to obtain the names of chromosomes from UCSC for a particular genome build, alternate ftp sites and paths can be specified by using the --ftp_site and --ftp_path options.
+Chromosome names can instead be provided on the commandline via the --chromosomes option, which accepts a comma separated list of:ChromName1[=length],ChromName2[=length],...
+
+
+
+ usage = "usage: %prog options"
+ parser = OptionParser( usage=usage )
+ parser.add_option( '-d', '--dbkey', dest='dbkey', default='hg18', help='dbkey to process' )
+ parser.add_option( '-i', '--input_dir', dest='input_dir', default=os.path.join( 'golden_path','%s', 'database' ), help='Input Directory' )
+ parser.add_option( '-o', '--output_dir', dest='output_dir', default=os.path.join( 'profiled_annotations','%s' ), help='Output Directory' )
+ parser.add_option( '-c', '--chromosomes', dest='chromosomes', default='', help='Comma separated list of: ChromName1[=length],ChromName2[=length],...' )
+ parser.add_option( '-b', '--bitset_size', dest='bitset_size', default=DEFAULT_BITSET_SIZE, type='int', help='Default BitSet size; overridden by sizes specified in chromInfo.txt.gz or by --chromosomes' )
+ parser.add_option( '-f', '--ftp_site', dest='ftp_site', default='hgdownload.cse.ucsc.edu', help='FTP site; used for chromosome info when chromInfo.txt.gz method fails' )
+ parser.add_option( '-p', '--ftp_path', dest='ftp_path', default='/goldenPath/%s/chromosomes/', help='FTP Path; used for chromosome info when chromInfo.txt.gz method fails' )
diff -r c37de7a983e7 -r 742fa2afcad9 scripts/tools/annotation_profiler/build_profile_indexes.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/scripts/tools/annotation_profiler/build_profile_indexes.py Fri Apr 23 11:14:26 2010 -0400
@@ -0,0 +1,338 @@
+#!/usr/bin/env python
+#Dan Blankenberg
+
+VERSION = '1.0.0' # version of this script
+
+from optparse import OptionParser
+import os, gzip, struct, time
+from ftplib import FTP #do we want a different method than FTP to determine chrom names, e.g. use a local copy?
+
+#import md5 from hashlib; if python2.4 or less, use old md5
+try:
+ from hashlib import md5
+except ImportError:
+ from md5 import new as md5
+
+#import BitSet from bx-python, try using eggs and package resources, fall back to any local installation
+try:
+ from galaxy import eggs
+ import pkg_resources
+ pkg_resources.require( "bx-python" )
+except: pass #Maybe there is a local installation available
+from bx.bitset import BitSet
+
+#Define constants
+STRUCT_FMT = '<I'
+STRUCT_SIZE = struct.calcsize( STRUCT_FMT )
+DEFAULT_BITSET_SIZE = 300000000
+CHUNK_SIZE = 1024
+
+#Headers used to parse .sql files to determine column indexes for chromosome name, start and end
+alias_spec = {
+ 'chromCol' : [ 'chrom' , 'CHROMOSOME' , 'CHROM', 'Chromosome Name', 'tName' ],
+ 'startCol' : [ 'start' , 'START', 'chromStart', 'txStart', 'Start Position (bp)', 'tStart', 'genoStart' ],
+ 'endCol' : [ 'end' , 'END' , 'STOP', 'chromEnd', 'txEnd', 'End Position (bp)', 'tEnd', 'genoEnd' ],
+}
+
+#Headers used to parse trackDb.txt.gz
+#TODO: these should be parsed directly from trackDb.sql
+trackDb_headers = ["tableName", "shortLabel", "type", "longLabel", "visibility", "priority", "colorR", "colorG", "colorB", "altColorR", "altColorG", "altColorB", "useScore", "private", "restrictCount", "restrictList", "url", "html", "grp", "canPack", "settings"]
+
+def get_columns( filename ):
+ input_sql = open( filename ).read()
+ input_sql = input_sql.split( 'CREATE TABLE ' )[1].split( ';' )[0]
+ input_sql = input_sql.split( ' (', 1 )
+ table_name = input_sql[0].strip().strip( '`' )
+ input_sql = [ split.strip().split( ' ' )[0].strip().strip( '`' ) for split in input_sql[1].rsplit( ')', 1 )[0].strip().split( '\n' ) ]
+ print input_sql
+ chrom_col = None
+ start_col = None
+ end_col = None
+ for col_name in alias_spec['chromCol']:
+ for i, header_name in enumerate( input_sql ):
+ if col_name == header_name:
+ chrom_col = i
+ break
+ if chrom_col is not None:
+ break
+
+ for col_name in alias_spec['startCol']:
+ for i, header_name in enumerate( input_sql ):
+ if col_name == header_name:
+ start_col = i
+ break
+ if start_col is not None:
+ break
+
+ for col_name in alias_spec['endCol']:
+ for i, header_name in enumerate( input_sql ):
+ if col_name == header_name:
+ end_col = i
+ break
+ if end_col is not None:
+ break
+
+ return table_name, chrom_col, start_col, end_col
+
+
+def create_grouping_xml( input_dir, output_dir, dbkey ):
+ output_filename = os.path.join( output_dir, '%s_tables.xml' % dbkey )
+ def load_groups( file_name = 'grp.txt.gz' ):
+ groups = {}
+ for line in gzip.open( os.path.join( input_dir, file_name ) ):
+ fields = line.split( '\t' )
+ groups[fields[0]] = { 'desc': fields[1], 'priority': fields[2] }
+ return groups
+ f = gzip.open( os.path.join( input_dir, 'trackDb.txt.gz' ) )
+ out = open( output_filename, 'wb' )
+ tables = {}
+ cur_buf = ''
+ while True:
+ line = f.readline()
+ if not line: break
+ #remove new lines
+ line = line.rstrip( '\n\r' )
+ line = line.replace( '\\\t', ' ' ) #replace escaped tabs with space
+ cur_buf += "%s\n" % line.rstrip( '\\' )
+ if line.endswith( '\\' ):
+ continue #line is wrapped, next line
+ #all fields should be loaded now...
+ fields = cur_buf.split( '\t' )
+ cur_buf = '' #reset buffer
+ assert len( fields ) == len( trackDb_headers ), 'Failed Parsing trackDb.txt.gz; fields: %s' % fields
+ table_name = fields[ 0 ]
+ tables[ table_name ] = {}
+ for field_name, field_value in zip( trackDb_headers, fields ):
+ tables[ table_name ][ field_name ] = field_value
+ #split settings fields into dict
+ fields = fields[-1].split( '\n' )
+ tables[ table_name ][ 'settings' ] = {}
+ for field in fields:
+ setting_fields = field.split( ' ', 1 )
+ setting_name = setting_value = setting_fields[ 0 ]
+ if len( setting_fields ) > 1:
+ setting_value = setting_fields[ 1 ]
+ if setting_name or setting_value:
+ tables[ table_name ][ 'settings' ][ setting_name ] = setting_value
+ #Load Groups
+ groups = load_groups()
+ in_groups = {}
+ for table_name, values in tables.iteritems():
+ if os.path.exists( os.path.join( output_dir, table_name ) ):
+ group = values['grp']
+ if group not in in_groups:
+ in_groups[group]={}
+ #***NAME CHANGE***, 'subTrack' no longer exists as a setting...use 'parent' instead
+ #subTrack = values.get('settings', {} ).get( 'subTrack', table_name )
+ subTrack = values.get('settings', {} ).get( 'parent', table_name ).split( ' ' )[0] #need to split, because could be e.g. 'trackgroup on'
+ if subTrack not in in_groups[group]:
+ in_groups[group][subTrack]=[]
+ in_groups[group][subTrack].append( table_name )
+
+ assigned_tables = []
+ out.write( """<filter type="data_meta" data_ref="input1" meta_key="dbkey" value="%s">\n""" % ( dbkey ) )
+ out.write( " <options>\n" )
+ for group, subTracks in sorted( in_groups.iteritems() ):
+ out.write( """ <option name="%s" value="group-%s">\n""" % ( groups[group]['desc'], group ) )
+ for sub_name, sub_tracks in subTracks.iteritems():
+ if len( sub_tracks ) > 1:
+ out.write( """ <option name="%s" value="subtracks-%s">\n""" % ( sub_name, sub_name ) )
+ sub_tracks.sort()
+ for track in sub_tracks:
+ track_label = track
+ if "$" not in tables[track]['shortLabel']:
+ track_label = tables[track]['shortLabel']
+ out.write( """ <option name="%s" value="%s"/>\n""" % ( track_label, track ) )
+ assigned_tables.append( track )
+ out.write( " </option>\n" )
+ else:
+ track = sub_tracks[0]
+ track_label = track
+ if "$" not in tables[track]['shortLabel']:
+ track_label = tables[track]['shortLabel']
+ out.write( """ <option name="%s" value="%s"/>\n""" % ( track_label, track ) )
+ assigned_tables.append( track )
+ out.write( " </option>\n" )
+ unassigned_tables = list( sorted( [ table_dir for table_dir in os.listdir( output_dir ) if table_dir not in assigned_tables and os.path.isdir( os.path.join( output_dir, table_dir ) ) ] ) )
+ if unassigned_tables:
+ out.write( """ <option name="Uncategorized Tables" value="group-trackDbUnassigned">\n""" )
+ for table_name in unassigned_tables:
+ out.write( """ <option name="%s" value="%s"/>\n""" % ( table_name, table_name ) )
+ out.write( " </option>\n" )
+ out.write( " </options>\n" )
+ out.write( """</filter>\n""" )
+ out.close()
+
+def write_database_dump_info( input_dir, output_dir, dbkey, chrom_lengths, default_bitset_size ):
+ #generate hash for profiled table directories
+ #sort directories off output root (files in output root not hashed, including the profiler_info.txt file)
+ #sort files in each directory and hash file contents
+ profiled_hash = md5()
+ for table_dir in sorted( [ table_dir for table_dir in os.listdir( output_dir ) if os.path.isdir( os.path.join( output_dir, table_dir ) ) ] ):
+ for filename in sorted( os.listdir( os.path.join( output_dir, table_dir ) ) ):
+ f = open( os.path.join( output_dir, table_dir, filename ), 'rb' )
+ while True:
+ hash_chunk = f.read( CHUNK_SIZE )
+ if not hash_chunk:
+ break
+ profiled_hash.update( hash_chunk )
+ profiled_hash = profiled_hash.hexdigest()
+
+ #generate hash for input dir
+ #sort directories off input root
+ #sort files in each directory and hash file contents
+ database_hash = md5()
+ for dirpath, dirnames, filenames in sorted( os.walk( input_dir ) ):
+ for filename in sorted( filenames ):
+ f = open( os.path.join( input_dir, dirpath, filename ), 'rb' )
+ while True:
+ hash_chunk = f.read( CHUNK_SIZE )
+ if not hash_chunk:
+ break
+ database_hash.update( hash_chunk )
+ database_hash = database_hash.hexdigest()
+
+ #write out info file
+ out = open( os.path.join( output_dir, 'profiler_info.txt' ), 'wb' )
+ out.write( 'dbkey\t%s\n' % ( dbkey ) )
+ out.write( 'chromosomes\t%s\n' % ( ','.join( [ '%s=%s' % ( chrom_name, chrom_len ) for chrom_name, chrom_len in chrom_lengths.iteritems() ] ) ) )
+ out.write( 'bitset_size\t%s\n' % ( default_bitset_size ) )
+ for line in open( os.path.join( input_dir, 'trackDb.sql' ) ):
+ line = line.strip()
+ if line.startswith( '-- Dump completed on ' ):
+ line = line[ len( '-- Dump completed on ' ): ]
+ out.write( 'dump_time\t%s\n' % ( line ) )
+ break
+ out.write( 'dump_hash\t%s\n' % ( database_hash ) )
+ out.write( 'profiler_time\t%s\n' % ( time.time() ) )
+ out.write( 'profiler_hash\t%s\n' % ( profiled_hash ) )
+ out.write( 'profiler_version\t%s\n' % ( VERSION ) )
+ out.write( 'profiler_struct_format\t%s\n' % ( STRUCT_FMT ) )
+ out.write( 'profiler_struct_size\t%s\n' % ( STRUCT_SIZE ) )
+ out.close()
+
+def __main__():
+ usage = "usage: %prog options"
+ parser = OptionParser( usage=usage )
+ parser.add_option( '-d', '--dbkey', dest='dbkey', default='hg18', help='dbkey to process' )
+ parser.add_option( '-i', '--input_dir', dest='input_dir', default=os.path.join( 'golden_path','%s', 'database' ), help='Input Directory' )
+ parser.add_option( '-o', '--output_dir', dest='output_dir', default=os.path.join( 'profiled_annotations','%s' ), help='Output Directory' )
+ parser.add_option( '-c', '--chromosomes', dest='chromosomes', default='', help='Comma separated list of: ChromName1[=length],ChromName2[=length],...' )
+ parser.add_option( '-b', '--bitset_size', dest='bitset_size', default=DEFAULT_BITSET_SIZE, type='int', help='Default BitSet size; overridden by sizes specified in chromInfo.txt.gz or by --chromosomes' )
+ parser.add_option( '-f', '--ftp_site', dest='ftp_site', default='hgdownload.cse.ucsc.edu', help='FTP site; used for chromosome info when chromInfo.txt.gz method fails' )
+ parser.add_option( '-p', '--ftp_path', dest='ftp_path', default='/goldenPath/%s/chromosomes/', help='FTP Path; used for chromosome info when chromInfo.txt.gz method fails' )
+
+ ( options, args ) = parser.parse_args()
+
+ input_dir = options.input_dir
+ if '%' in input_dir:
+ input_dir = input_dir % options.dbkey
+ assert os.path.exists( input_dir ), 'Input directory does not exist'
+ output_dir = options.output_dir
+ if '%' in output_dir:
+ output_dir = output_dir % options.dbkey
+ assert not os.path.exists( output_dir ), 'Output directory already exists'
+ os.makedirs( output_dir )
+ ftp_path = options.ftp_path
+ if '%' in ftp_path:
+ ftp_path = ftp_path % options.dbkey
+
+ #Get chromosome names and lengths
+ chrom_lengths = {}
+ if options.chromosomes:
+ for chrom in options.chromosomes.split( ',' ):
+ fields = chrom.split( '=' )
+ chrom = fields[0]
+ if len( fields ) > 1:
+ chrom_len = int( fields[1] )
+ else:
+ chrom_len = options.bitset_size
+ chrom_lengths[ chrom ] = chrom_len
+ chroms = chrom_lengths.keys()
+ print 'Chrom info taken from command line option.'
+ else:
+ try:
+ for line in gzip.open( os.path.join( input_dir, 'chromInfo.txt.gz' ) ):
+ fields = line.strip().split( '\t' )
+ chrom_lengths[ fields[0] ] = int( fields[ 1 ] )
+ chroms = chrom_lengths.keys()
+ print 'Chrom info taken from chromInfo.txt.gz.'
+ except Exception, e:
+ print 'Error loading chrom info from chromInfo.txt.gz, trying FTP method.'
+ chrom_lengths = {} #zero out chrom_lengths
+ chroms = []
+ ftp = FTP( options.ftp_site )
+ ftp.login()
+ for name in ftp.nlst( ftp_path ):
+ if name.endswith( '.fa.gz' ):
+ chroms.append( name.split( '/' )[-1][ :-len( '.fa.gz' ) ] )
+ ftp.close()
+ for chrom in chroms:
+ chrom_lengths[ chrom ] = options.bitset_size
+ #sort chroms by length of name, descending; necessary for when table names start with chrom name
+ chroms = list( reversed( [ chrom for chrom_len, chrom in sorted( [ ( len( chrom ), chrom ) for chrom in chroms ] ) ] ) )
+
+ #parse tables from local files
+ #loop through directory contents, if file ends in '.sql', process table
+ for filename in os.listdir( input_dir ):
+ if filename.endswith ( '.sql' ):
+ base_filename = filename[ 0:-len( '.sql' ) ]
+ table_out_dir = os.path.join( output_dir, base_filename )
+ #some tables are chromosome specific; strip off the chrom name
+ for chrom in chroms:
+ if base_filename.startswith( "%s_" % chrom ):
+ #found chromosome
+ table_out_dir = os.path.join( output_dir, base_filename[len( "%s_" % chrom ):] )
+ break
+ #create table dir
+ if not os.path.exists( table_out_dir ):
+ os.mkdir( table_out_dir ) #table dir may already exist in the case of single chrom tables
+ print "Created table dir (%s)." % table_out_dir
+ else:
+ print "Table dir (%s) already exists." % table_out_dir
+ #find column assignments
+ table_name, chrom_col, start_col, end_col = get_columns( "%s.sql" % os.path.join( input_dir, base_filename ) )
+ if chrom_col is None or start_col is None or end_col is None:
+ print "Table %s (%s) does not appear to have a chromosome, a start, or a stop." % ( table_name, "%s.sql" % os.path.join( input_dir, base_filename ) )
+ if not os.listdir( table_out_dir ):
+ print "Removing empty table (%s) directory (%s)." % ( table_name, table_out_dir )
+ os.rmdir( table_out_dir )
+ continue
+ #build bitsets from table
+ bitset_dict = {}
+ for line in gzip.open( '%s.txt.gz' % os.path.join( input_dir, base_filename ) ):
+ fields = line.strip().split( '\t' )
+ chrom = fields[ chrom_col ]
+ start = int( fields[ start_col ] )
+ end = int( fields[ end_col ] )
+ if chrom not in bitset_dict:
+ bitset_dict[ chrom ] = BitSet( chrom_lengths.get( chrom, options.bitset_size ) )
+ bitset_dict[ chrom ].set_range( start, end - start )
+ #write bitsets as profiled annotations
+ for chrom_name, chrom_bits in bitset_dict.iteritems():
+ out = open( os.path.join( table_out_dir, '%s.covered' % chrom_name ), 'wb' )
+ end = 0
+ total_regions = 0
+ total_coverage = 0
+ max_size = chrom_lengths.get( chrom_name, options.bitset_size )
+ while True:
+ start = chrom_bits.next_set( end )
+ if start >= max_size:
+ break
+ end = chrom_bits.next_clear( start )
+ out.write( struct.pack( STRUCT_FMT, start ) )
+ out.write( struct.pack( STRUCT_FMT, end ) )
+ total_regions += 1
+ total_coverage += end - start
+ if end >= max_size:
+ break
+ out.close()
+ open( os.path.join( table_out_dir, '%s.total_regions' % chrom_name ), 'wb' ).write( str( total_regions ) )
+ open( os.path.join( table_out_dir, '%s.total_coverage' % chrom_name ), 'wb' ).write( str( total_coverage ) )
+
+ #create xml
+ create_grouping_xml( input_dir, output_dir, options.dbkey )
+ #create database dump info file, for database version control
+ write_database_dump_info( input_dir, output_dir, options.dbkey, chrom_lengths, options.bitset_size )
+
+if __name__ == "__main__": __main__()
diff -r c37de7a983e7 -r 742fa2afcad9 tool-data/annotation_profiler_options.xml.sample
--- a/tool-data/annotation_profiler_options.xml.sample Thu Apr 22 21:11:17 2010 -0400
+++ b/tool-data/annotation_profiler_options.xml.sample Fri Apr 23 11:14:26 2010 -0400
@@ -1,4 +1,4 @@
-<filter type="meta_key" name="dbkey" value="hg18">
+<filter type="data_meta" data_ref="input1" meta_key="dbkey" value="hg18">
<options>
<option name="Mapping and Sequencing Tracks" value="group-map">
<option name="STS Markers" value="stsMap"/>
diff -r c37de7a983e7 -r 742fa2afcad9 tools/annotation_profiler/annotation_profiler.xml
--- a/tools/annotation_profiler/annotation_profiler.xml Thu Apr 22 21:11:17 2010 -0400
+++ b/tools/annotation_profiler/annotation_profiler.xml Fri Apr 23 11:14:26 2010 -0400
@@ -1,6 +1,6 @@
<tool id="Annotation_Profiler_0" name="Profile Annotations" Version="1.0.0">
<description>for a set of genomic intervals</description>
- <command interpreter="python">annotation_profiler_for_interval.py -i $input1 -c ${input1.metadata.chromCol} -s ${input1.metadata.startCol} -e ${input1.metadata.endCol} -o $out_file1 $keep_empty -p ${GALAXY_DATA_INDEX_DIR}/annotation_profiler/$dbkey $summary -l ${chromInfo} -b 3 -t $table_names</command>
+ <command interpreter="python">annotation_profiler_for_interval.py -i $input1 -c ${input1.metadata.chromCol} -s ${input1.metadata.startCol} -e ${input1.metadata.endCol} -o $out_file1 $keep_empty -p ${GALAXY_DATA_INDEX_DIR}/annotation_profiler/$dbkey $summary -b 3 -t $table_names</command>
<inputs>
<param format="interval" name="input1" type="data" label="Choose Intervals">
<validator type="dataset_metadata_in_file" filename="annotation_profiler_valid_builds.txt" metadata_name="dbkey" metadata_column="0" message="Profiling is not currently available for this species."/>
@@ -41,7 +41,7 @@
<help>
**What it does**
-Takes an input set of intervals and for each interval determines the base coverage of the interval by a set of features (tables) available from UCSC.
+Takes an input set of intervals and for each interval determines the base coverage of the interval by a set of features (tables) available from UCSC. Genomic regions from the underlying feature data have been merged by overlap or direct adjacency (e.g. a table with ranges 1-10, 6-12, 12-20 and 25-28 yields two merged ranges: 1-20 and 25-28).
By default, this tool will check the coverage of your intervals against all available features; you may, however, choose to select only those tables that you want to include. Selecting a section heading will effectively cause all of its children to be selected.
diff -r c37de7a983e7 -r 742fa2afcad9 tools/annotation_profiler/annotation_profiler_for_interval.py
--- a/tools/annotation_profiler/annotation_profiler_for_interval.py Thu Apr 22 21:11:17 2010 -0400
+++ b/tools/annotation_profiler/annotation_profiler_for_interval.py Fri Apr 23 11:14:26 2010 -0400
@@ -18,12 +18,13 @@
assert sys.version_info[:2] >= ( 2, 4 )
class CachedRangesInFile:
- fmt = '<I'
- fmt_size = struct.calcsize( fmt )
- def __init__( self, filename ):
+ DEFAULT_STRUCT_FORMAT = '<I'
+ def __init__( self, filename, profiler_info ):
self.file_size = os.stat( filename ).st_size
self.file = open( filename, 'rb' )
self.filename = filename
+ self.fmt = profiler_info.get( 'profiler_struct_format', self.DEFAULT_STRUCT_FORMAT )
+ self.fmt_size = int( profiler_info.get( 'profiler_struct_size', struct.calcsize( self.fmt ) ) )
self.length = int( self.file_size / self.fmt_size / 2 )
self._cached_ranges = [ None for i in xrange( self.length ) ]
def __getitem__( self, i ):
@@ -43,9 +44,9 @@
return self.length
class RegionCoverage:
- def __init__( self, filename_base ):
+ def __init__( self, filename_base, profiler_info ):
try:
- self._coverage = CachedRangesInFile( "%s.covered" % filename_base )
+ self._coverage = CachedRangesInFile( "%s.covered" % filename_base, profiler_info )
except Exception, e:
#print "Error loading coverage file %s: %s" % ( "%s.covered" % filename_base, e )
self._coverage = []
@@ -89,12 +90,14 @@
return coverage, region_count, start_index
class CachedCoverageReader:
- def __init__( self, base_file_path, buffer = 10, table_names = None ):
+ def __init__( self, base_file_path, buffer = 10, table_names = None, profiler_info = None ):
self._base_file_path = base_file_path
self._buffer = buffer #number of chromosomes to keep in memory at a time
self._coverage = {}
- if table_names is None: table_names = os.listdir( self._base_file_path )
+ if table_names is None: table_names = [ table_dir for table_dir in os.listdir( self._base_file_path ) if os.path.isdir( os.path.join( self._base_file_path, table_dir ) ) ]
for tablename in table_names: self._coverage[tablename] = {}
+ if profiler_info is None: profiler_info = {}
+ self._profiler_info = profiler_info
def iter_table_coverage_by_region( self, chrom, start, end ):
for tablename, coverage, regions in self.iter_table_coverage_regions_by_region( chrom, start, end ):
yield tablename, coverage
@@ -107,7 +110,7 @@
if len( chromosomes ) >= self._buffer:
#randomly remove one chromosome from this table
del chromosomes[ chromosomes.keys().pop( random.randint( 0, self._buffer - 1 ) ) ]
- chromosomes[chrom] = RegionCoverage( os.path.join ( self._base_file_path, tablename, chrom ) )
+ chromosomes[chrom] = RegionCoverage( os.path.join ( self._base_file_path, tablename, chrom ), self._profiler_info )
coverage, regions, index = chromosomes[chrom].get_coverage_regions_index_overlap( start, end )
yield tablename, coverage, regions, index
@@ -240,19 +243,35 @@
print "%s has max length of %s, exceeded by %s%s." % ( chrom, chrom_lengths.get( chrom ), ", ".join( map( str, regions[:3] ) ), extra_region_info )
class ChromosomeLengths:
- def __init__( self, filename ):
+ def __init__( self, profiler_info ):
self.chroms = {}
- try:
- for line in open( filename ):
- try:
- fields = line.strip().split( "\t" )
- self.chroms[fields[0]] = int( fields[1] )
- except:
- continue
- except:
- pass
+ self.default_bitset_size = int( profiler_info.get( 'bitset_size', bx.bitset.MAX ) )
+ chroms = profiler_info.get( 'chromosomes', None )
+ if chroms:
+ for chrom in chroms.split( ',' ):
+ fields = chrom.rsplit( '=', 1 )
+ if len( fields ) == 2:
+ self.chroms[ fields[0] ] = int( fields[1] )
+ else:
+ self.chroms[ fields[0] ] = self.default_bitset_size
def get( self, name ):
- return self.chroms.get( name, bx.bitset.MAX )
+ return self.chroms.get( name, self.default_bitset_size )
+
+def parse_profiler_info( filename ):
+ profiler_info = {}
+ try:
+ for line in open( filename ):
+ fields = line.rstrip( '\n\r' ).split( '\t', 1 )
+ if len( fields ) == 2:
+ if fields[0] in profiler_info:
+ if not isinstance( profiler_info[ fields[0] ], list ):
+ profiler_info[ fields[0] ] = [ profiler_info[ fields[0] ] ]
+ profiler_info[ fields[0] ].append( fields[1] )
+ else:
+ profiler_info[ fields[0] ] = fields[1]
+ except:
+ pass #likely missing file
+ return profiler_info
def __main__():
parser = optparse.OptionParser()
@@ -294,16 +313,10 @@
help='Path to profiled data for this organism'
)
parser.add_option(
- '-l','--lengths',
- dest='lengths',
- type='str',default='test-data/shared/ucsc/hg18.len',
- help='Path to chromosome lengths data for this organism'
- )
- parser.add_option(
'-t','--table_names',
dest='table_names',
type='str',default='None',
- help='Path to profiled data for this organism'
+ help='Table names requested'
)
parser.add_option(
'-i','--input',
@@ -327,14 +340,19 @@
options, args = parser.parse_args()
+ #get profiler_info
+ profiler_info = parse_profiler_info( os.path.join( options.path, 'profiler_info.txt' ) )
+
table_names = options.table_names.split( "," )
if table_names == ['None']: table_names = None
- coverage_reader = CachedCoverageReader( options.path, buffer = options.buffer, table_names = table_names )
+ coverage_reader = CachedCoverageReader( options.path, buffer = options.buffer, table_names = table_names, profiler_info = profiler_info )
if options.summary:
- profile_summary( options.interval_filename, options.chrom_col - 1, options.start_col - 1, options.end_col -1, options.out_filename, options.keep_empty, coverage_reader, ChromosomeLengths( options.lengths ) )
+ profile_summary( options.interval_filename, options.chrom_col - 1, options.start_col - 1, options.end_col -1, options.out_filename, options.keep_empty, coverage_reader, ChromosomeLengths( profiler_info ) )
else:
profile_per_interval( options.interval_filename, options.chrom_col - 1, options.start_col - 1, options.end_col -1, options.out_filename, options.keep_empty, coverage_reader )
+ #print out data version info
+ print 'Data version (%s:%s:%s)' % ( profiler_info.get( 'dbkey', 'unknown' ), profiler_info.get( 'profiler_hash', 'unknown' ), profiler_info.get( 'dump_time', 'unknown' ) )
if __name__ == "__main__": __main__()
details: http://www.bx.psu.edu/hg/galaxy/rev/c37de7a983e7
changeset: 3682:c37de7a983e7
user: rc
date: Thu Apr 22 21:11:17 2010 -0400
description:
lims: added request_types permissions
diffstat:
lib/galaxy/model/__init__.py | 19 ++
lib/galaxy/model/mapping.py | 15 +
lib/galaxy/model/migrate/versions/0045_request_type_permissions_table.py | 34 +++
lib/galaxy/security/__init__.py | 38 ++++-
lib/galaxy/web/controllers/requests.py | 3 +-
lib/galaxy/web/controllers/requests_admin.py | 59 +++++-
templates/admin/requests/request_type_permissions.mako | 92 ++++++++++
templates/webapps/galaxy/base_panels.mako | 2 +-
8 files changed, 253 insertions(+), 9 deletions(-)
diffs (392 lines):
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Thu Apr 22 16:05:08 2010 -0400
+++ b/lib/galaxy/model/__init__.py Thu Apr 22 21:11:17 2010 -0400
@@ -73,6 +73,18 @@
if can_show:
libraries[ library ] = hidden_folder_ids
return libraries
+ def accessible_request_types(self, trans):
+ # get all permitted libraries for this user
+ all_rt_list = trans.sa_session.query( trans.app.model.RequestType ) \
+ .filter( trans.app.model.RequestType.table.c.deleted == False ) \
+ .order_by( trans.app.model.RequestType.name )
+ roles = self.all_roles()
+ rt_list = []
+ for rt in all_rt_list:
+ for permission in rt.actions:
+ if permission.role.id in [r.id for r in roles]:
+ rt_list.append(rt)
+ return list(set(rt_list))
class Job( object ):
"""
@@ -1445,6 +1457,7 @@
self.comment = comment
class RequestType( object ):
+ permitted_actions = get_permitted_actions( filter='REQUEST_TYPE' )
def __init__(self, name=None, desc=None, request_form=None, sample_form=None,
datatx_info=None):
self.name = name
@@ -1452,6 +1465,12 @@
self.request_form = request_form
self.sample_form = sample_form
self.datatx_info = datatx_info
+
+class RequestTypePermissions( object ):
+ def __init__( self, action, request_type, role ):
+ self.action = action
+ self.request_type = request_type
+ self.role = role
class Sample( object ):
transfer_status = Bunch( NOT_STARTED = 'Not started',
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py Thu Apr 22 16:05:08 2010 -0400
+++ b/lib/galaxy/model/mapping.py Thu Apr 22 21:11:17 2010 -0400
@@ -628,6 +628,14 @@
Column( "datatx_info", JSONType() ),
Column( "deleted", Boolean, index=True, default=False ) )
+RequestTypePermissions.table = Table( "request_type_permissions", metadata,
+ Column( "id", Integer, primary_key=True ),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, default=now, onupdate=now ),
+ Column( "action", TEXT ),
+ Column( "request_type_id", Integer, ForeignKey( "request_type.id" ), nullable=True, index=True ),
+ Column( "role_id", Integer, ForeignKey( "role.id" ), index=True ) )
+
FormValues.table = Table('form_values', metadata,
Column( "id", Integer, primary_key=True),
Column( "create_time", DateTime, default=now ),
@@ -923,6 +931,13 @@
primaryjoin=( RequestType.table.c.sample_form_id == FormDefinition.table.c.id ) ),
) )
+assign_mapper( context, RequestTypePermissions, RequestTypePermissions.table,
+ properties=dict(
+ request_type=relation( RequestType, backref="actions" ),
+ role=relation( Role, backref="request_type_actions" )
+ )
+)
+
assign_mapper( context, FormDefinition, FormDefinition.table,
properties=dict( current=relation( FormDefinitionCurrent,
primaryjoin=( FormDefinition.table.c.form_definition_current_id == FormDefinitionCurrent.table.c.id ) )
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/model/migrate/versions/0045_request_type_permissions_table.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/model/migrate/versions/0045_request_type_permissions_table.py Thu Apr 22 21:11:17 2010 -0400
@@ -0,0 +1,34 @@
+"""
+Migration script to add the request_type_permissions table.
+"""
+
+from sqlalchemy import *
+from migrate import *
+from migrate.changeset import *
+
+import datetime
+now = datetime.datetime.utcnow
+
+import logging
+log = logging.getLogger( __name__ )
+
+metadata = MetaData( migrate_engine )
+
+RequestTypePermissions_table = Table( "request_type_permissions", metadata,
+ Column( "id", Integer, primary_key=True ),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, default=now, onupdate=now ),
+ Column( "action", TEXT ),
+ Column( "request_type_id", Integer, ForeignKey( "request_type.id" ), nullable=True, index=True ),
+ Column( "role_id", Integer, ForeignKey( "role.id" ), index=True ) )
+
+def upgrade():
+ print __doc__
+ metadata.reflect()
+ try:
+ RequestTypePermissions_table.create()
+ except Exception, e:
+ log.debug( "Creating request_type_permissions table failed: %s" % str( e ) )
+
+def downgrade():
+ pass
\ No newline at end of file
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/security/__init__.py
--- a/lib/galaxy/security/__init__.py Thu Apr 22 16:05:08 2010 -0400
+++ b/lib/galaxy/security/__init__.py Thu Apr 22 21:11:17 2010 -0400
@@ -24,7 +24,10 @@
LIBRARY_ACCESS = Action( "access library", "Restrict access to this library to only role members", "restrict" ),
LIBRARY_ADD = Action( "add library item", "Role members can add library items to this library item", "grant" ),
LIBRARY_MODIFY = Action( "modify library item", "Role members can modify this library item", "grant" ),
- LIBRARY_MANAGE = Action( "manage library permissions", "Role members can manage roles associated with permissions on this library item", "grant" )
+ LIBRARY_MANAGE = Action( "manage library permissions", "Role members can manage roles associated with permissions on this library item", "grant" ),
+ # Request type permissions
+ REQUEST_TYPE_ACCESS = Action( "access request_type", "Restrict access to this request_type to only role members", "restrict" )
+
)
def get_action( self, name, default=None ):
"""Get a permitted action by its dict key or action name"""
@@ -754,6 +757,39 @@
else:
hidden_folder_ids = '%d' % sub_folder.id
return False, hidden_folder_ids
+ #
+ # RequestType Permissions
+ #
+ def can_access_request_type( self, roles, request_type ):
+ action = self.permitted_actions.REQUEST_TYPE_ACCESS
+ request_type_actions = []
+ for permission in request_type.actions:
+ if permission.action == action.action:
+ request_type_actions.append(permission)
+ if not request_type_actions:
+ return action.model == 'restrict'
+ ret_val = False
+ for item_action in request_type_actions:
+ if item_action.role in roles:
+ ret_val = True
+ break
+ return ret_val
+ def set_request_type_permissions( self, request_type, permissions={} ):
+ # Set new permissions on request_type, eliminating all current permissions
+ for role_assoc in request_type.actions:
+ self.sa_session.delete( role_assoc )
+ # Add the new permissions on request_type
+ item_class = self.model.RequestType
+ permission_class = self.model.RequestTypePermissions
+ for action, roles in permissions.items():
+ if isinstance( action, Action ):
+ action = action.action
+ for role_assoc in [ permission_class( action, request_type, role ) for role in roles ]:
+ self.sa_session.add( role_assoc )
+ self.sa_session.flush()
+
+
+
class HostAgent( RBACAgent ):
"""
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/web/controllers/requests.py
--- a/lib/galaxy/web/controllers/requests.py Thu Apr 22 16:05:08 2010 -0400
+++ b/lib/galaxy/web/controllers/requests.py Thu Apr 22 21:11:17 2010 -0400
@@ -623,8 +623,7 @@
details=details,
edit_mode=edit_mode)
def __select_request_type(self, trans, rtid):
- requesttype_list = trans.sa_session.query( trans.app.model.RequestType )\
- .order_by( trans.app.model.RequestType.name.asc() )
+ requesttype_list = trans.user.accessible_request_types(trans)
rt_ids = ['none']
for rt in requesttype_list:
if not rt.deleted:
diff -r 1b30f5fa152b -r c37de7a983e7 lib/galaxy/web/controllers/requests_admin.py
--- a/lib/galaxy/web/controllers/requests_admin.py Thu Apr 22 16:05:08 2010 -0400
+++ b/lib/galaxy/web/controllers/requests_admin.py Thu Apr 22 21:11:17 2010 -0400
@@ -195,7 +195,8 @@
visible=False,
filterable="standard" ) )
operations = [
- #grids.GridOperation( "Update", allow_multiple=False, condition=( lambda item: not item.deleted ) ),
+ grids.GridOperation( "Permissions", allow_multiple=False, condition=( lambda item: not item.deleted ) ),
+ #grids.GridOperation( "Clone", allow_multiple=False, condition=( lambda item: not item.deleted ) ),
grids.GridOperation( "Delete", allow_multiple=True, condition=( lambda item: not item.deleted ) ),
grids.GridOperation( "Undelete", condition=( lambda item: item.deleted ) ),
]
@@ -258,6 +259,7 @@
'''
List all request made by the current user
'''
+ #self.__sample_datasets(trans, **kwd)
if 'operation' in kwd:
operation = kwd['operation'].lower()
if not kwd.get( 'id', None ):
@@ -534,8 +536,7 @@
#---- Request Creation ----------------------------------------------------------
#
def __select_request_type(self, trans, rtid):
- requesttype_list = trans.sa_session.query( trans.app.model.RequestType )\
- .order_by( trans.app.model.RequestType.name.asc() )
+ requesttype_list = trans.user.accessible_request_types(trans)
rt_ids = ['none']
for rt in requesttype_list:
if not rt.deleted:
@@ -1771,6 +1772,25 @@
dataset_index=dataset_index,
message=message,
status=status)
+
+# def __sample_datasets(self, trans, **kwd):
+# samples = trans.sa_session.query( trans.app.model.Sample ).all()
+# for s in samples:
+# if s.dataset_files:
+# newdf = []
+# for df in s.dataset_files:
+# if type(s.dataset_files[0]) == type([1,2]):
+# filepath = df[0]
+# status = df[1]
+# newdf.append(dict(filepath=filepath,
+# status=status,
+# name=filepath.split('/')[-1],
+# error_msg='',
+# size='Unknown'))
+# s.dataset_files = newdf
+# trans.sa_session.add( s )
+# trans.sa_session.flush()
+#
##
#### Request Type Stuff ###################################################
##
@@ -1792,8 +1812,10 @@
return self.__delete_request_type( trans, **kwd )
elif operation == "undelete":
return self.__undelete_request_type( trans, **kwd )
-# elif operation == "update":
-# return self.__edit_request( trans, **kwd )
+ elif operation == "clone":
+ return self.__clone_request_type( trans, **kwd )
+ elif operation == "permissions":
+ return self.__show_request_type_permissions( trans, **kwd )
# Render the grid view
return self.requesttype_grid( trans, **kwd )
def __view_request_type(self, trans, **kwd):
@@ -1992,3 +2014,30 @@
action='manage_request_types',
message='%i request type(s) has been undeleted' % len(id_list),
status='done') )
+ def __show_request_type_permissions(self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ try:
+ rt = trans.sa_session.query( trans.app.model.RequestType ).get( trans.security.decode_id(kwd['id']) )
+ except:
+ return trans.response.send_redirect( web.url_for( controller='requests_admin',
+ action='manage_request_types',
+ status='error',
+ message="Invalid requesttype ID") )
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name )
+ if params.get( 'update_roles_button', False ):
+ permissions = {}
+ for k, v in trans.app.model.RequestType.permitted_actions.items():
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
+ trans.app.security_agent.set_request_type_permissions( rt, permissions )
+ trans.sa_session.refresh( rt )
+ message = "Permissions updated for request type '%s'" % rt.name
+ return trans.fill_template( '/admin/requests/request_type_permissions.mako',
+ request_type=rt,
+ roles=roles,
+ status=status,
+ message=message)
diff -r 1b30f5fa152b -r c37de7a983e7 templates/admin/requests/request_type_permissions.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/admin/requests/request_type_permissions.mako Thu Apr 22 21:11:17 2010 -0400
@@ -0,0 +1,92 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<script type="text/javascript">
+ $( document ).ready( function () {
+ $( '.role_add_button' ).click( function() {
+ var action = this.id.substring( 0, this.id.lastIndexOf( '_add_button' ) )
+ var in_select = '#' + action + '_in_select';
+ var out_select = '#' + action + '_out_select';
+ return !$( out_select + ' option:selected' ).remove().appendTo( in_select );
+ });
+ $( '.role_remove_button' ).click( function() {
+ var action = this.id.substring( 0, this.id.lastIndexOf( '_remove_button' ) )
+ var in_select = '#' + action + '_in_select';
+ var out_select = '#' + action + '_out_select';
+ return !$( in_select + ' option:selected' ).remove().appendTo( out_select );
+ });
+ $( 'form#edit_role_associations' ).submit( function() {
+ $( '.in_select option' ).each(function( i ) {
+ $( this ).attr( "selected", "selected" );
+ });
+ });
+ });
+</script>
+
+
+<div class="toolForm">
+ <div class="toolFormTitle">Manage permissions on "${request_type.name}"</div>
+ <div class="toolFormBody">
+ <form name="request_type_permissions" id="request_type_permissions" action="${h.url_for( controller='requests_admin', action='manage_request_types', operation="permissions", id=trans.security.encode_id(request_type.id))}" method="post">
+ <div class="form-row">
+## %for k, v in permitted_actions:
+## %if k not in do_not_render:
+## <div class="form-row">
+## ${render_select( current_actions, k, v, all_roles )}
+## </div>
+## %endif
+## %endfor
+## <%def name="render_select( current_actions, action_key, action, all_roles )">
+ <%
+ obj_name = request_type.name
+ current_actions = request_type.actions
+ permitted_actions = trans.app.model.RequestType.permitted_actions.items()
+ action = trans.app.model.RequestType.permitted_actions.REQUEST_TYPE_ACCESS
+ obj_str = 'request_type %s' % obj_name
+ obj_type = 'request_type'
+ all_roles = roles
+ action_key = 'REQUEST_TYPE_ACCESS'
+
+ import sets
+ in_roles = sets.Set()
+ for a in current_actions:
+ if a.action == action.action:
+ in_roles.add( a.role )
+ out_roles = filter( lambda x: x not in in_roles, all_roles )
+ %>
+ <p>
+ <b>${action.action}:</b> ${action.description}
+ </p>
+ <div style="width: 100%; white-space: nowrap;">
+ <div style="float: left; width: 50%;">
+ Roles associated:<br/>
+ <select name="${action_key}_in" id="${action_key}_in_select" class="in_select" style="max-width: 98%; width: 98%; height: 150px; font-size: 100%;" multiple>
+ %for role in in_roles:
+ <option value="${role.id}">${role.name}</option>
+ %endfor
+ </select> <br/>
+ <div style="width: 98%; text-align: right"><input type="submit" id="${action_key}_remove_button" class="role_remove_button" value=">>"/></div>
+ </div>
+ <div style="width: 50%;">
+ Roles not associated:<br/>
+ <select name="${action_key}_out" id="${action_key}_out_select" style="max-width: 98%; width: 98%; height: 150px; font-size: 100%;" multiple>
+ %for role in out_roles:
+ <option value="${role.id}">${role.name}</option>
+ %endfor
+ </select> <br/>
+ <input type="submit" id="${action_key}_add_button" class="role_add_button" value="<<"/>
+ </div>
+ </div>
+## </%def>
+ </div>
+ <div class="form-row">
+ <input type="submit" name="update_roles_button" value="Save"/>
+ </div>
+ </form>
+ </div>
+</div>
\ No newline at end of file
diff -r 1b30f5fa152b -r c37de7a983e7 templates/webapps/galaxy/base_panels.mako
--- a/templates/webapps/galaxy/base_panels.mako Thu Apr 22 16:05:08 2010 -0400
+++ b/templates/webapps/galaxy/base_panels.mako Thu Apr 22 21:11:17 2010 -0400
@@ -36,7 +36,7 @@
${tab( "libraries", "Data Libraries", h.url_for( controller='/library', action='index' ))}
%endif
- %if trans.user and trans.request_types():
+ %if trans.user and trans.user.accessible_request_types(trans):
<td class="tab">
<a>Lab</a>
<div class="submenu">
details: http://www.bx.psu.edu/hg/galaxy/rev/1b30f5fa152b
changeset: 3681:1b30f5fa152b
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Thu Apr 22 16:05:08 2010 -0400
description:
GTF to BEDGraph converter.
diffstat:
test-data/gtf2bedgraph_in.gtf | 100 ++++++++++++++++++++++++++++
test-data/gtf2bedgraph_out.bedgraph | 101 +++++++++++++++++++++++++++++
tool_conf.xml.sample | 1 +
tools/filters/gtf2bedgraph.xml | 79 ++++++++++++++++++++++
tools/filters/gtf_to_bedgraph_converter.py | 73 ++++++++++++++++++++
5 files changed, 354 insertions(+), 0 deletions(-)
diffs (381 lines):
diff -r 3445ca17a4c5 -r 1b30f5fa152b test-data/gtf2bedgraph_in.gtf
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/gtf2bedgraph_in.gtf Thu Apr 22 16:05:08 2010 -0400
@@ -0,0 +1,100 @@
+chr1 Cufflinks exon 36425950 36426026 1000 - . gene_id "uc007aqa.1"; transcript_id "uc007aqa.1"; exon_number "21"; FPKM "4.8386844109"; frac "0.515875"; conf_lo "0.000000"; conf_hi "9.779040"; cov "0.274837";
+chr1 Cufflinks exon 46891972 46892996 1000 - . gene_id "uc007axc.1"; transcript_id "uc007axc.1"; exon_number "9"; FPKM "8.4688567539"; frac "1.000000"; conf_lo "6.667227"; conf_hi "10.270487"; cov "0.481031";
+chr1 Cufflinks exon 71654478 71654594 1000 - . gene_id "uc007bkb.1"; transcript_id "uc007bkb.1"; exon_number "4"; FPKM "0.4686878995"; frac "0.186704"; conf_lo "0.300747"; conf_hi "0.636629"; cov "0.026621";
+chr1 Cufflinks transcript 72629845 72679706 1000 + . gene_id "uc007bks.1"; transcript_id "uc007bks.1"; FPKM "4.0695297327"; frac "1.000000"; conf_lo "2.473329"; conf_hi "5.665731"; cov "0.231149";
+chr1 Cufflinks exon 75531753 75532000 1000 + . gene_id "uc007bpt.1"; transcript_id "uc007bpt.1"; exon_number "24"; FPKM "3.6392661141"; frac "1.000000"; conf_lo "2.391008"; conf_hi "4.887524"; cov "0.206710";
+chr1 Cufflinks exon 123389482 123389564 1000 + . gene_id "uc007cju.1"; transcript_id "uc007cju.1"; exon_number "20"; FPKM "0.9948773061"; frac "1.000000"; conf_lo "0.105032"; conf_hi "1.884723"; cov "0.056509";
+chr1 Cufflinks exon 129625990 129626119 1000 + . gene_id "uc007ckv.1"; transcript_id "uc007ckv.1"; exon_number "1"; FPKM "0.0003267777"; frac "0.004692"; conf_lo "0.000000"; conf_hi "0.000915"; cov "0.000019";
+chr1 Cufflinks exon 132059397 132059512 1000 + . gene_id "uc007clw.1"; transcript_id "uc007clw.1"; exon_number "7"; FPKM "0.2051423010"; frac "0.886787"; conf_lo "0.000000"; conf_hi "0.509199"; cov "0.011652";
+chr1 Cufflinks exon 175865141 175865308 1000 - . gene_id "uc007dsf.1"; transcript_id "uc007dsf.1"; exon_number "5"; FPKM "0.6544444010"; frac "1.000000"; conf_lo "0.068952"; conf_hi "1.239936"; cov "0.037172";
+chr10 Cufflinks transcript 7399380 7400956 1000 - . gene_id "uc007eie.1"; transcript_id "uc007eie.1"; FPKM "2.1099978681"; frac "1.000000"; conf_lo "0.514989"; conf_hi "3.705006"; cov "0.119848";
+chr10 Cufflinks exon 79784826 79784954 1000 - . gene_id "uc007gcr.1"; transcript_id "uc007gcr.1"; exon_number "2"; FPKM "1.2054582676"; frac "1.000000"; conf_lo "0.000000"; conf_hi "2.597402"; cov "0.068470";
+chr10 Cufflinks exon 79820729 79820836 1000 + . gene_id "uc007gcy.1"; transcript_id "uc007gcy.1"; exon_number "2"; FPKM "1.8177911161"; frac "1.000000"; conf_lo "0.532419"; conf_hi "3.103164"; cov "0.103250";
+chr10 Cufflinks transcript 105907395 106369573 1000 + . gene_id "uc007gyr.1"; transcript_id "uc007gyr.1"; FPKM "4.2493607936"; frac "0.247216"; conf_lo "3.727223"; conf_hi "4.771499"; cov "0.241364";
+chr10 Cufflinks exon 119487061 119487172 1000 + . gene_id "uc007hep.1"; transcript_id "uc007hep.1"; exon_number "10"; FPKM "4.3105966126"; frac "0.341843"; conf_lo "3.127417"; conf_hi "5.493776"; cov "0.244842";
+chr11 Cufflinks exon 29097093 29097209 1000 + . gene_id "uc007igs.1"; transcript_id "uc007igs.1"; exon_number "7"; FPKM "4.2530782301"; frac "1.000000"; conf_lo "2.700074"; conf_hi "5.806083"; cov "0.241575";
+chr11 Cufflinks exon 69404158 69404264 1000 + . gene_id "uc007jqm.1"; transcript_id "uc007jqm.1"; exon_number "10"; FPKM "18.7450971965"; frac "0.685277"; conf_lo "11.773851"; conf_hi "25.716343"; cov "1.064721";
+chr11 Cufflinks transcript 98249986 98261804 1000 - . gene_id "uc007lgh.1"; transcript_id "uc007lgh.1"; FPKM "2.1571271227"; frac "1.000000"; conf_lo "0.856331"; conf_hi "3.457924"; cov "0.122525";
+chr11 Cufflinks exon 102210141 102211681 1000 - . gene_id "uc007lrp.1"; transcript_id "uc007lrp.1"; exon_number "1"; FPKM "0.8688186006"; frac "1.000000"; conf_lo "0.254471"; conf_hi "1.483166"; cov "0.049349";
+chr11 Cufflinks transcript 105926400 105927243 1000 - . gene_id "uc007lya.1"; transcript_id "uc007lya.1"; FPKM "3.6706747247"; frac "1.000000"; conf_lo "0.000000"; conf_hi "8.861793"; cov "0.208494";
+chr11 Cufflinks exon 106633966 106634066 1000 - . gene_id "uc007lzm.1"; transcript_id "uc007lzm.1"; exon_number "2"; FPKM "2.4729108195"; frac "0.689555"; conf_lo "0.805433"; conf_hi "4.140389"; cov "0.140461";
+chr11 Cufflinks exon 120472427 120472492 1000 - . gene_id "uc007mtq.1"; transcript_id "uc007mtq.1"; exon_number "3"; FPKM "10.2380258574"; frac "0.356865"; conf_lo "4.499395"; conf_hi "15.976656"; cov "0.581520";
+chr12 Cufflinks exon 100112717 100112852 1000 - . gene_id "uc007orn.1"; transcript_id "uc007orn.1"; exon_number "39"; FPKM "1.8669402513"; frac "0.154118"; conf_lo "1.707295"; conf_hi "2.026586"; cov "0.106042";
+chr13 Cufflinks exon 8889564 8891614 1000 + . gene_id "uc007pkn.1"; transcript_id "uc007pkn.1"; exon_number "5"; FPKM "9.4402522582"; frac "1.000000"; conf_lo "6.745038"; conf_hi "12.135466"; cov "0.536206";
+chr13 Cufflinks exon 13756207 13756380 1000 + . gene_id "uc007pmj.1"; transcript_id "uc007pmj.1"; exon_number "18"; FPKM "0.0574771218"; frac "0.793101"; conf_lo "0.000000"; conf_hi "0.140705"; cov "0.003265";
+chr13 Cufflinks exon 93243918 93244083 1000 - . gene_id "uc007rkp.1"; transcript_id "uc007rkp.1"; exon_number "4"; FPKM "6.9802111138"; frac "1.000000"; conf_lo "3.858566"; conf_hi "10.101856"; cov "0.396476";
+chr14 Cufflinks exon 13130096 13130170 1000 + . gene_id "uc007sfq.1"; transcript_id "uc007sfq.1"; exon_number "4"; FPKM "4.0381928600"; frac "1.000000"; conf_lo "2.254366"; conf_hi "5.822020"; cov "0.229369";
+chr14 Cufflinks exon 32036106 32036250 1000 + . gene_id "uc007sxe.1"; transcript_id "uc007sxe.1"; exon_number "10"; FPKM "0.1289615781"; frac "1.000000"; conf_lo "0.000000"; conf_hi "0.386885"; cov "0.007325";
+chr14 Cufflinks exon 56517080 56517223 1000 - . gene_id "uc007ubd.1"; transcript_id "uc007ubd.1"; exon_number "2"; FPKM "15.7683764379"; frac "0.548796"; conf_lo "8.949920"; conf_hi "22.586833"; cov "0.895643";
+chr14 Cufflinks exon 62950942 62951013 1000 + . gene_id "uc007ugl.1"; transcript_id "uc007ugl.1"; exon_number "1"; FPKM "10.1138803585"; frac "1.000000"; conf_lo "6.480867"; conf_hi "13.746893"; cov "0.574468";
+chr14 Cufflinks exon 66479007 66479052 1000 + . gene_id "uc007ujq.1"; transcript_id "uc007ujq.1"; exon_number "8"; FPKM "14.3011267395"; frac "1.000000"; conf_lo "10.806805"; conf_hi "17.795448"; cov "0.812304";
+chr14 Cufflinks exon 70961619 70961783 1000 + . gene_id "uc007uoj.1"; transcript_id "uc007uoj.1"; exon_number "7"; FPKM "2.0814553995"; frac "1.000000"; conf_lo "1.231705"; conf_hi "2.931206"; cov "0.118227";
+chr14 Cufflinks exon 96679222 96679434 1000 - . gene_id "uc007uuq.1"; transcript_id "uc007uuq.1"; exon_number "7"; FPKM "1.7614342028"; frac "1.000000"; conf_lo "0.851833"; conf_hi "2.671035"; cov "0.100049";
+chr14 Cufflinks exon 99504388 99504488 1000 + . gene_id "uc007uvc.1"; transcript_id "uc007uvc.1"; exon_number "3"; FPKM "3.1573312214"; frac "0.277705"; conf_lo "2.620155"; conf_hi "3.694508"; cov "0.179336";
+chr15 Cufflinks exon 12777808 12777962 1000 + . gene_id "uc007vic.1"; transcript_id "uc007vic.1"; exon_number "6"; FPKM "12.7118803258"; frac "0.653301"; conf_lo "7.807708"; conf_hi "17.616053"; cov "0.722034";
+chr15 Cufflinks exon 28200049 28200282 1000 + . gene_id "uc007vjy.1"; transcript_id "uc007vjy.1"; exon_number "19"; FPKM "0.0608801712"; frac "1.000000"; conf_lo "0.000000"; conf_hi "0.146978"; cov "0.003458";
+chr15 Cufflinks exon 34434714 34434889 1000 + . gene_id "uc007vlv.1"; transcript_id "uc007vlv.1"; exon_number "4"; FPKM "2.1698982510"; frac "1.000000"; conf_lo "1.049368"; conf_hi "3.290429"; cov "0.123250";
+chr15 Cufflinks transcript 51709056 51716160 1000 + . gene_id "uc007vrc.1"; transcript_id "uc007vrc.1"; FPKM "5.0213279245"; frac "1.000000"; conf_lo "3.187798"; conf_hi "6.854858"; cov "0.285211";
+chr15 Cufflinks exon 54880182 54880296 1000 - . gene_id "uc007vrt.1"; transcript_id "uc007vrt.1"; exon_number "14"; FPKM "9.7267082384"; frac "1.000000"; conf_lo "7.809774"; conf_hi "11.643643"; cov "0.552477";
+chr15 Cufflinks exon 59176893 59177072 1000 - . gene_id "uc007vxs.1"; transcript_id "uc007vxs.1"; exon_number "11"; FPKM "4.5392702144"; frac "1.000000"; conf_lo "2.723562"; conf_hi "6.354978"; cov "0.257830";
+chr15 Cufflinks exon 76426650 76426779 1000 - . gene_id "uc007wla.1"; transcript_id "uc007wla.1"; exon_number "3"; FPKM "3.5730073595"; frac "0.230550"; conf_lo "2.576136"; conf_hi "4.569879"; cov "0.202947";
+chr15 Cufflinks exon 76533504 76533613 1000 + . gene_id "uc007wlt.1"; transcript_id "uc007wlt.1"; exon_number "4"; FPKM "3.3395072810"; frac "0.491112"; conf_lo "2.499197"; conf_hi "4.179818"; cov "0.189684";
+chr15 Cufflinks exon 88963183 88963261 1000 - . gene_id "uc007xfl.1"; transcript_id "uc007xfl.1"; exon_number "3"; FPKM "1.5871531781"; frac "1.000000"; conf_lo "0.291248"; conf_hi "2.883058"; cov "0.090150";
+chr15 Cufflinks exon 102455470 102455519 1000 - . gene_id "uc007xwk.1"; transcript_id "uc007xwk.1"; exon_number "13"; FPKM "0.2873090379"; frac "0.161741"; conf_lo "0.099159"; conf_hi "0.475459"; cov "0.016319";
+chr16 Cufflinks transcript 3979123 3982204 1000 - . gene_id "uc007xzf.1"; transcript_id "uc007xzf.1"; FPKM "4.1992546925"; frac "0.467884"; conf_lo "2.835257"; conf_hi "5.563252"; cov "0.238518";
+chr15 Cufflinks exon 102313591 102313719 1000 + . gene_id "uc007xvy.2"; transcript_id "uc007xvy.2"; exon_number "7"; FPKM "37.5792165910"; frac "0.297738"; conf_lo "34.688492"; conf_hi "40.469941"; cov "2.134498";
+chr16 Cufflinks exon 4608598 4608818 1000 + . gene_id "uc007xzw.1"; transcript_id "uc007xzw.1"; exon_number "2"; FPKM "5.7793602049"; frac "1.000000"; conf_lo "4.036818"; conf_hi "7.521903"; cov "0.328267";
+chr16 Cufflinks exon 20541820 20541939 1000 + . gene_id "uc007ypy.1"; transcript_id "uc007ypy.1"; exon_number "7"; FPKM "68.0268643583"; frac "1.000000"; conf_lo "60.085498"; conf_hi "75.968231"; cov "3.863924";
+chr17 Cufflinks transcript 24857054 24858867 1000 + . gene_id "uc008axy.1"; transcript_id "uc008axy.1"; FPKM "22.0141466642"; frac "1.000000"; conf_lo "15.369306"; conf_hi "28.658988"; cov "1.250403";
+chr17 Cufflinks exon 25379604 25380686 1000 - . gene_id "uc008bah.1"; transcript_id "uc008bah.1"; exon_number "1"; FPKM "1.7458387165"; frac "0.226783"; conf_lo "1.488719"; conf_hi "2.002959"; cov "0.099164";
+chr17 Cufflinks exon 27159196 27159462 1000 + . gene_id "uc008bfe.1"; transcript_id "uc008bfe.1"; exon_number "2"; FPKM "1.7334774900"; frac "0.118977"; conf_lo "1.272113"; conf_hi "2.194842"; cov "0.098461";
+chr18 Cufflinks exon 34787707 34787836 1000 + . gene_id "uc008ela.1"; transcript_id "uc008ela.1"; exon_number "7"; FPKM "5.0638001964"; frac "0.237331"; conf_lo "4.342098"; conf_hi "5.785503"; cov "0.287624";
+chr18 Cufflinks exon 61371052 61371250 1000 - . gene_id "uc008fbu.1"; transcript_id "uc008fbu.1"; exon_number "4"; FPKM "0.1230526474"; frac "1.000000"; conf_lo "0.000000"; conf_hi "0.369158"; cov "0.006989";
+chr18 Cufflinks exon 61167370 61167501 1000 - . gene_id "uc008fbi.1"; transcript_id "uc008fbi.1"; exon_number "12"; FPKM "2.4172869897"; frac "1.000000"; conf_lo "1.244731"; conf_hi "3.589843"; cov "0.137302";
+chr18 Cufflinks exon 86630592 86630719 1000 + . gene_id "uc008fuz.1"; transcript_id "uc008fuz.1"; exon_number "6"; FPKM "2.2892787327"; frac "1.000000"; conf_lo "1.065608"; conf_hi "3.512950"; cov "0.130031";
+chr19 Cufflinks exon 5603634 5603715 1000 - . gene_id "uc008gea.1"; transcript_id "uc008gea.1"; exon_number "2"; FPKM "2.1837193523"; frac "0.163446"; conf_lo "1.715120"; conf_hi "2.652319"; cov "0.124035";
+chr2 Cufflinks exon 28404475 28404676 1000 + . gene_id "uc008iyn.1"; transcript_id "uc008iyn.1"; exon_number "15"; FPKM "10.9087431164"; frac "0.368384"; conf_lo "4.356515"; conf_hi "17.460972"; cov "0.619616";
+chr2 Cufflinks exon 29770254 29770439 1000 + . gene_id "uc008jal.1"; transcript_id "uc008jal.1"; exon_number "12"; FPKM "7.2973656902"; frac "0.685974"; conf_lo "5.778526"; conf_hi "8.816206"; cov "0.414490";
+chr2 Cufflinks exon 30002172 30002382 1000 + . gene_id "uc008jbj.1"; transcript_id "uc008jbj.1"; exon_number "8"; FPKM "12.8769808138"; frac "1.000000"; conf_lo "10.220662"; conf_hi "15.533299"; cov "0.731412";
+chr2 Cufflinks exon 32076600 32076704 1000 + . gene_id "uc008jeo.1"; transcript_id "uc008jeo.1"; exon_number "21"; FPKM "43.8860660433"; frac "0.911093"; conf_lo "40.407190"; conf_hi "47.364942"; cov "2.492727";
+chr2 Cufflinks exon 32546710 32546774 1000 - . gene_id "uc008jgm.1"; transcript_id "uc008jgm.1"; exon_number "12"; FPKM "8.1366623064"; frac "1.000000"; conf_lo "5.496780"; conf_hi "10.776544"; cov "0.462162";
+chr2 Cufflinks exon 35574280 35574458 1000 + . gene_id "uc008jkv.1"; transcript_id "uc008jkv.1"; exon_number "6"; FPKM "2.0012109810"; frac "0.141121"; conf_lo "1.688896"; conf_hi "2.313526"; cov "0.113669";
+chr2 Cufflinks exon 117127697 117127757 1000 - . gene_id "uc008lrl.1"; transcript_id "uc008lrl.1"; exon_number "14"; FPKM "1.6760710643"; frac "0.685093"; conf_lo "1.109659"; conf_hi "2.242483"; cov "0.095201";
+chr2 Cufflinks exon 122435405 122435623 1000 - . gene_id "uc008maw.1"; transcript_id "uc008maw.1"; exon_number "8"; FPKM "10.5679023498"; frac "1.000000"; conf_lo "7.636894"; conf_hi "13.498911"; cov "0.600257";
+chr2 Cufflinks exon 130265172 130265261 1000 + . gene_id "uc008mja.1"; transcript_id "uc008mja.1"; exon_number "9"; FPKM "3.6318426438"; frac "0.287992"; conf_lo "2.815837"; conf_hi "4.447848"; cov "0.206289";
+chr2 Cufflinks exon 152702303 152702428 1000 + . gene_id "uc008ngq.1"; transcript_id "uc008ngq.1"; exon_number "7"; FPKM "2.5312142816"; frac "0.526901"; conf_lo "1.108909"; conf_hi "3.953519"; cov "0.143773";
+chr2 Cufflinks exon 158262739 158262887 1000 + . gene_id "uc008nqh.1"; transcript_id "uc008nqh.1"; exon_number "8"; FPKM "5.0001206267"; frac "1.000000"; conf_lo "3.934091"; conf_hi "6.066150"; cov "0.284007";
+chr2 Cufflinks exon 178152211 178152296 1000 + . gene_id "uc008ohq.1"; transcript_id "uc008ohq.1"; exon_number "3"; FPKM "1.6796903776"; frac "1.000000"; conf_lo "0.491970"; conf_hi "2.867411"; cov "0.095406";
+chr3 Cufflinks exon 97500913 97501137 1000 - . gene_id "uc008qpe.1"; transcript_id "uc008qpe.1"; exon_number "7"; FPKM "4.1738869883"; frac "0.398377"; conf_lo "3.671923"; conf_hi "4.675851"; cov "0.237077";
+chr3 Cufflinks exon 101987874 101987902 1000 - . gene_id "uc008qrt.1"; transcript_id "uc008qrt.1"; exon_number "4"; FPKM "0.6428024028"; frac "1.000000"; conf_lo "0.000000"; conf_hi "1.551862"; cov "0.036511";
+chr3 Cufflinks exon 127258214 127258303 1000 + . gene_id "uc008rhf.1"; transcript_id "uc008rhf.1"; exon_number "2"; FPKM "0.4060755145"; frac "0.353557"; conf_lo "0.120085"; conf_hi "0.692066"; cov "0.023065";
+chr3 Cufflinks exon 144790795 144790854 1000 + . gene_id "uc008rqi.1"; transcript_id "uc008rqi.1"; exon_number "5"; FPKM "1.1258808773"; frac "0.289434"; conf_lo "0.699104"; conf_hi "1.552658"; cov "0.063950";
+chr4 Cufflinks exon 17978869 17981846 1000 + . gene_id "uc008sbv.1"; transcript_id "uc008sbv.1"; exon_number "5"; FPKM "0.6623587694"; frac "0.585087"; conf_lo "0.270053"; conf_hi "1.054665"; cov "0.037622";
+chr4 Cufflinks exon 21711840 21711940 1000 + . gene_id "uc008scz.1"; transcript_id "uc008scz.1"; exon_number "5"; FPKM "0.9584930367"; frac "0.150841"; conf_lo "0.742054"; conf_hi "1.174932"; cov "0.054442";
+chr4 Cufflinks exon 108353507 108353731 1000 - . gene_id "uc008ubn.1"; transcript_id "uc008ubn.1"; exon_number "9"; FPKM "2.6286767383"; frac "1.000000"; conf_lo "1.111010"; conf_hi "4.146344"; cov "0.149309";
+chr4 Cufflinks exon 131325668 131325803 1000 - . gene_id "uc008vab.1"; transcript_id "uc008vab.1"; exon_number "2"; FPKM "4.0813960015"; frac "1.000000"; conf_lo "2.890730"; conf_hi "5.272062"; cov "0.231823";
+chr4 Cufflinks exon 153530641 153530927 1000 + . gene_id "uc008wbi.1"; transcript_id "uc008wbi.1"; exon_number "12"; FPKM "21.2412511761"; frac "1.000000"; conf_lo "16.217040"; conf_hi "26.265462"; cov "1.206502";
+chr5 Cufflinks exon 3631589 3631765 1000 + . gene_id "uc008whf.1"; transcript_id "uc008whf.1"; exon_number "19"; FPKM "4.6386616700"; frac "0.517324"; conf_lo "3.723563"; conf_hi "5.553760"; cov "0.263476";
+chr5 Cufflinks exon 3992046 3992138 1000 + . gene_id "uc008wid.1"; transcript_id "uc008wid.1"; exon_number "15"; FPKM "23.3742995121"; frac "0.874843"; conf_lo "21.278988"; conf_hi "25.469611"; cov "1.327659";
+chr5 Cufflinks exon 34223636 34223836 1000 + . gene_id "uc008xbk.1"; transcript_id "uc008xbk.1"; exon_number "12"; FPKM "4.1101744570"; frac "0.642098"; conf_lo "2.270677"; conf_hi "5.949672"; cov "0.233458";
+chr5 Cufflinks exon 115734400 115734621 1000 + . gene_id "uc008zdo.1"; transcript_id "uc008zdo.1"; exon_number "3"; FPKM "15.3221708908"; frac "1.000000"; conf_lo "11.506469"; conf_hi "19.137873"; cov "0.870299";
+chr5 Cufflinks exon 137807769 137808016 1000 + . gene_id "uc009aci.1"; transcript_id "uc009aci.1"; exon_number "12"; FPKM "0.7189248975"; frac "0.533440"; conf_lo "0.000000"; conf_hi "1.543846"; cov "0.040835";
+chr6 Cufflinks transcript 17015149 17055825 1000 + . gene_id "uc009azi.1"; transcript_id "uc009azi.1"; FPKM "12.3429992456"; frac "1.000000"; conf_lo "9.242902"; conf_hi "15.443097"; cov "0.701082";
+chr6 Cufflinks exon 15361026 15361102 1000 + . gene_id "uc009ayz.1"; transcript_id "uc009ayz.1"; exon_number "14"; FPKM "4.1692596952"; frac "0.281345"; conf_lo "2.894471"; conf_hi "5.444049"; cov "0.236814";
+chr6 Cufflinks exon 115576309 115576426 1000 - . gene_id "uc009dix.1"; transcript_id "uc009dix.1"; exon_number "8"; FPKM "34.7320589881"; frac "0.628311"; conf_lo "31.195284"; conf_hi "38.268834"; cov "1.972780";
+chr6 Cufflinks exon 117820274 117822784 1000 + . gene_id "uc009dld.1"; transcript_id "uc009dld.1"; exon_number "3"; FPKM "8.2141924772"; frac "1.000000"; conf_lo "5.778655"; conf_hi "10.649730"; cov "0.466566";
+chr6 Cufflinks exon 121331667 121331759 1000 - . gene_id "uc009don.1"; transcript_id "uc009don.1"; exon_number "4"; FPKM "0.9373248338"; frac "0.255959"; conf_lo "0.597786"; conf_hi "1.276864"; cov "0.053240";
+chr6 Cufflinks exon 134837648 134837803 1000 - . gene_id "uc009ekw.1"; transcript_id "uc009ekw.1"; exon_number "2"; FPKM "3.4342255434"; frac "0.337007"; conf_lo "2.099159"; conf_hi "4.769292"; cov "0.195064";
+chr7 Cufflinks exon 19628774 19628924 1000 + . gene_id "uc009fkg.1"; transcript_id "uc009fkg.1"; exon_number "14"; FPKM "3.4380795645"; frac "0.240903"; conf_lo "2.901335"; conf_hi "3.974824"; cov "0.195283";
+chr7 Cufflinks transcript 51739887 51740783 1000 + . gene_id "uc009gpo.1"; transcript_id "uc009gpo.1"; FPKM "3.5875651083"; frac "1.000000"; conf_lo "0.658330"; conf_hi "6.516800"; cov "0.203774";
+chr7 Cufflinks exon 53085965 53086159 1000 - . gene_id "uc009gxj.1"; transcript_id "uc009gxj.1"; exon_number "6"; FPKM "6.4200658663"; frac "0.543693"; conf_lo "4.666748"; conf_hi "8.173383"; cov "0.364660";
+chr7 Cufflinks exon 77546982 77547077 1000 + . gene_id "uc009hnk.1"; transcript_id "uc009hnk.1"; exon_number "7"; FPKM "0.4622078998"; frac "0.405823"; conf_lo "0.077413"; conf_hi "0.847003"; cov "0.026253";
+chr7 Cufflinks exon 82788205 82788350 1000 + . gene_id "uc009hwu.1"; transcript_id "uc009hwu.1"; exon_number "1"; FPKM "0.6859341657"; frac "0.576962"; conf_lo "0.055268"; conf_hi "1.316600"; cov "0.038961";
+chr7 Cufflinks exon 85984891 85985078 1000 - . gene_id "uc009hxo.1"; transcript_id "uc009hxo.1"; exon_number "3"; FPKM "3.0017741434"; frac "1.000000"; conf_lo "1.397258"; conf_hi "4.606290"; cov "0.170501";
+chr7 Cufflinks exon 148509981 148510078 1000 - . gene_id "uc009kkn.1"; transcript_id "uc009kkn.1"; exon_number "11"; FPKM "32.9197864125"; frac "1.000000"; conf_lo "27.375094"; conf_hi "38.464479"; cov "1.869843";
+chr9 Cufflinks exon 15330072 15330148 1000 + . gene_id "uc009ogc.1"; transcript_id "uc009ogc.1"; exon_number "2"; FPKM "1.0060367764"; frac "1.000000"; conf_lo "0.000000"; conf_hi "2.428788"; cov "0.057143";
+chr9 Cufflinks transcript 21069743 21078812 1000 + . gene_id "uc009okt.1"; transcript_id "uc009okt.1"; FPKM "7.9134805855"; frac "0.623402"; conf_lo "5.930640"; conf_hi "9.896321"; cov "0.449485";
+chr9 Cufflinks exon 57867100 57867303 1000 + . gene_id "uc009pwa.1"; transcript_id "uc009pwa.1"; exon_number "4"; FPKM "0.5359102332"; frac "1.000000"; conf_lo "0.000000"; conf_hi "1.293802"; cov "0.030440";
+chr9 Cufflinks exon 49314958 49315758 1000 - . gene_id "uc009pje.1"; transcript_id "uc009pje.1"; exon_number "2"; FPKM "156.0206032233"; frac "0.793945"; conf_lo "147.369898"; conf_hi "164.671308"; cov "8.861965";
+chr9 Cufflinks exon 106815438 106815604 1000 - . gene_id "uc009rkv.1"; transcript_id "uc009rkv.1"; exon_number "12"; FPKM "5.4023275754"; frac "1.000000"; conf_lo "4.337713"; conf_hi "6.466942"; cov "0.306852";
+chr9 Cufflinks exon 119703054 119703292 1000 - . gene_id "uc009sbk.1"; transcript_id "uc009sbk.1"; exon_number "15"; FPKM "0.0814657030"; frac "1.000000"; conf_lo "0.000000"; conf_hi "0.244397"; cov "0.004627";
\ No newline at end of file
diff -r 3445ca17a4c5 -r 1b30f5fa152b test-data/gtf2bedgraph_out.bedgraph
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/gtf2bedgraph_out.bedgraph Thu Apr 22 16:05:08 2010 -0400
@@ -0,0 +1,101 @@
+chr1 36425949 36426026 4.8386844109
+chr1 46891971 46892996 8.4688567539
+chr1 71654477 71654594 0.4686878995
+chr1 72629844 72679706 4.0695297327
+chr1 75531752 75532000 3.6392661141
+chr1 123389481 123389564 0.9948773061
+chr1 129625989 129626119 0.0003267777
+chr1 132059396 132059512 0.2051423010
+chr1 175865140 175865308 0.6544444010
+chr10 7399379 7400956 2.1099978681
+chr10 79784825 79784954 1.2054582676
+chr10 79820728 79820836 1.8177911161
+chr10 105907394 106369573 4.2493607936
+chr10 119487060 119487172 4.3105966126
+chr11 29097092 29097209 4.2530782301
+chr11 69404157 69404264 18.7450971965
+chr11 98249985 98261804 2.1571271227
+chr11 102210140 102211681 0.8688186006
+chr11 105926399 105927243 3.6706747247
+chr11 106633965 106634066 2.4729108195
+chr11 120472426 120472492 10.2380258574
+chr12 100112716 100112852 1.8669402513
+chr13 8889563 8891614 9.4402522582
+chr13 13756206 13756380 0.0574771218
+chr13 93243917 93244083 6.9802111138
+chr14 13130095 13130170 4.0381928600
+chr14 32036105 32036250 0.1289615781
+chr14 56517079 56517223 15.7683764379
+chr14 62950941 62951013 10.1138803585
+chr14 66479006 66479052 14.3011267395
+chr14 70961618 70961783 2.0814553995
+chr14 96679221 96679434 1.7614342028
+chr14 99504387 99504488 3.1573312214
+chr15 12777807 12777962 12.7118803258
+chr15 28200048 28200282 0.0608801712
+chr15 34434713 34434889 2.1698982510
+chr15 51709055 51716160 5.0213279245
+chr15 54880181 54880296 9.7267082384
+chr15 59176892 59177072 4.5392702144
+chr15 76426649 76426779 3.5730073595
+chr15 76533503 76533613 3.3395072810
+chr15 88963182 88963261 1.5871531781
+chr15 102313590 102313719 37.5792165910
+chr15 102455469 102455519 0.2873090379
+chr16 3979122 3982204 4.1992546925
+chr16 4608597 4608818 5.7793602049
+chr16 20541819 20541939 68.0268643583
+chr17 24857053 24858867 22.0141466642
+chr17 25379603 25380686 1.7458387165
+chr17 27159195 27159462 1.7334774900
+chr18 34787706 34787836 5.0638001964
+chr18 61167369 61167501 2.4172869897
+chr18 61371051 61371250 0.1230526474
+chr18 86630591 86630719 2.2892787327
+chr19 5603633 5603715 2.1837193523
+chr2 28404474 28404676 10.9087431164
+chr2 29770253 29770439 7.2973656902
+chr2 30002171 30002382 12.8769808138
+chr2 32076599 32076704 43.8860660433
+chr2 32546709 32546774 8.1366623064
+chr2 35574279 35574458 2.0012109810
+chr2 117127696 117127757 1.6760710643
+chr2 122435404 122435623 10.5679023498
+chr2 130265171 130265261 3.6318426438
+chr2 152702302 152702428 2.5312142816
+chr2 158262738 158262887 5.0001206267
+chr2 178152210 178152296 1.6796903776
+chr3 97500912 97501137 4.1738869883
+chr3 101987873 101987902 0.6428024028
+chr3 127258213 127258303 0.4060755145
+chr3 144790794 144790854 1.1258808773
+chr4 17978868 17981846 0.6623587694
+chr4 21711839 21711940 0.9584930367
+chr4 108353506 108353731 2.6286767383
+chr4 131325667 131325803 4.0813960015
+chr4 153530640 153530927 21.2412511761
+chr5 3631588 3631765 4.6386616700
+chr5 3992045 3992138 23.3742995121
+chr5 34223635 34223836 4.1101744570
+chr5 115734399 115734621 15.3221708908
+chr5 137807768 137808016 0.7189248975
+chr6 15361025 15361102 4.1692596952
+chr6 17015148 17055825 12.3429992456
+chr6 115576308 115576426 34.7320589881
+chr6 117820273 117822784 8.2141924772
+chr6 121331666 121331759 0.9373248338
+chr6 134837647 134837803 3.4342255434
+chr7 19628773 19628924 3.4380795645
+chr7 51739886 51740783 3.5875651083
+chr7 53085964 53086159 6.4200658663
+chr7 77546981 77547077 0.4622078998
+chr7 82788204 82788350 0.6859341657
+chr7 85984890 85985078 3.0017741434
+chr7 148509980 148510078 32.9197864125
+chr9 15330071 15330148 1.0060367764
+chr9 21069742 21078812 7.9134805855
+chr9 49314957 49315758 156.0206032233
+chr9 57867099 57867303 0.5359102332
+chr9 106815437 106815604 5.4023275754
+chr9 119703053 119703292 0.0814657030
+track type=bedGraph
diff -r 3445ca17a4c5 -r 1b30f5fa152b tool_conf.xml.sample
--- a/tool_conf.xml.sample Thu Apr 22 09:35:44 2010 -0400
+++ b/tool_conf.xml.sample Thu Apr 22 16:05:08 2010 -0400
@@ -79,6 +79,7 @@
<tool file="fastx_toolkit/fastq_to_fasta.xml" />
<tool file="filters/wiggle_to_simple.xml" />
<tool file="filters/sff_extractor.xml" />
+ <tool file="filters/gtf2bedgraph.xml" />
</section>
<section name="Extract Features" id="features">
<tool file="filters/ucsc_gene_bed_to_exon_bed.xml" />
diff -r 3445ca17a4c5 -r 1b30f5fa152b tools/filters/gtf2bedgraph.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/filters/gtf2bedgraph.xml Thu Apr 22 16:05:08 2010 -0400
@@ -0,0 +1,79 @@
+<tool id="gtf2bedgraph" name="GTF-to-BEDGraph">
+ <description>converter</description>
+ <command interpreter="python">gtf_to_bedgraph_converter.py $input $out_file1 $attribute_name</command>
+ <inputs>
+ <param format="gtf" name="input" type="data" label="Convert this query"/>
+ <param name="attribute_name" type="text" label="Attribute to Use for Value"/>
+ </inputs>
+ <outputs>
+ <data format="bedgraph" name="out_file1" />
+ </outputs>
+ <tests>
+ <test>
+ <param name="input" value="gtf2bedgraph_in.gtf" ftype="gtf"/>
+ <param name="attribute_name" value="FPKM"/>
+ <output name="out_file1" file="gtf2bedgraph_out.bedgraph" ftype="bedgraph"/>
+ </test>
+ </tests>
+ <help>
+
+**What it does**
+
+This tool converts data from GTF format to BEDGraph format (scroll down for format description).
+
+--------
+
+**Example**
+
+The following data in GFF format::
+
+ chr22 GeneA enhancer 10000000 10001000 500 + . gene_id "GeneA"; transcript_id "TranscriptAlpha"; FPKM "2.75"; frac "1.000000";
+ chr22 GeneA promoter 10010000 10010100 900 + . gene_id "GeneA"; transcript_id "TranscriptsAlpha"; FPKM "2.25"; frac "1.000000";
+
+using the attribute name 'FPKM' will be converted to BEDGraph (**note** that 1 is subtracted from the start coordinate)::
+
+
+ chr22 9999999 10001000 2.75
+ chr22 10009999 10010100 2.25
+
+------
+
+.. class:: infomark
+
+**About formats**
+
+**GTF format** Gene Transfer Format is a format for describing genes and other features associated with DNA, RNA and Protein sequences. GTF lines have nine tab-separated fields::
+
+ 1. seqname - Must be a chromosome or scaffold.
+ 2. source - The program that generated this feature.
+ 3. feature - The name of this type of feature. Some examples of standard feature types are "CDS", "start_codon", "stop_codon", and "exon".
+ 4. start - The starting position of the feature in the sequence. The first base is numbered 1.
+ 5. end - The ending position of the feature (inclusive).
+ 6. score - A score between 0 and 1000. If there is no score value, enter ".".
+ 7. strand - Valid entries include '+', '-', or '.' (for don't know/care).
+ 8. frame - If the feature is a coding exon, frame should be a number between 0-2 that represents the reading frame of the first base. If the feature is not a coding exon, the value should be '.'.
+ 9. group - The group field is a list of attributes. Each attribute consists of a type/value pair. Attributes must end in a semi-colon, and be separated from any following attribute by exactly one space. The attribute list must begin with the two mandatory attributes: (i) gene_id value - A globally unique identifier for the genomic source of the sequence and (ii) transcript_id value - A globally unique identifier for the predicted transcript.
+
+**BEDGraph format**
+
+The bedGraph format is line-oriented. Bedgraph data are preceeded by a track definition line, which adds a number of options for controlling the default display of this track.
+
+For the track definition line, all options are placed in a single line separated by spaces:
+ track type=bedGraph name=track_label description=center_label
+ visibility=display_mode color=r,g,b altColor=r,g,b
+ priority=priority autoScale=on|off alwaysZero=on|off
+ gridDefault=on|off maxHeightPixels=max:default:min
+ graphType=bar|points viewLimits=lower:upper
+ yLineMark=real-value yLineOnOff=on|off
+ windowingFunction=maximum|mean|minimum smoothingWindow=off|2-16
+
+The track type is REQUIRED, and must be bedGraph:
+ type=bedGraph
+
+Following the track definition line are the track data in four column BED format::
+
+ chromA chromStartA chromEndA dataValueA
+ chromB chromStartB chromEndB dataValueB
+
+</help>
+</tool>
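
The group-field grammar described in the help above is the piece the converter leans on. As a rough illustration of how such an attribute list can be folded into a dict (a hypothetical parse_gtf_attributes helper, not code from this changeset; it avoids str.partition so it would also run under the Python 2.4 floor the script asserts)::

    def parse_gtf_attributes( group_field ):
        # Turns 'gene_id "GeneA"; transcript_id "TranscriptAlpha"; FPKM "2.75";'
        # into {'gene_id': 'GeneA', 'transcript_id': 'TranscriptAlpha', 'FPKM': '2.75'}
        attributes = {}
        for pair in group_field.strip().split( ";" ):
            pair = pair.strip()
            if not pair:
                continue
            fields = pair.split( None, 1 )  # split name from value on first whitespace
            if len( fields ) == 2:
                attributes[ fields[0] ] = fields[1].strip( ' "' )
        return attributes

    # parse_gtf_attributes( 'gene_id "GeneA"; FPKM "2.75";' )[ "FPKM" ] == "2.75"
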
diff -r 3445ca17a4c5 -r 1b30f5fa152b tools/filters/gtf_to_bedgraph_converter.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/filters/gtf_to_bedgraph_converter.py Thu Apr 22 16:05:08 2010 -0400
@@ -0,0 +1,73 @@
+#!/usr/bin/env python
+import os, sys, tempfile
+
+assert sys.version_info[:2] >= ( 2, 4 )
+
+def __main__():
+    # Read parameters.
+    input_name = sys.argv[1]
+    output_name = sys.argv[2]
+    attribute_name = sys.argv[3]
+
+    # Create temp file.
+    tmp_name = tempfile.NamedTemporaryFile().name
+
+    # Do conversion.
+    skipped_lines = 0
+    first_skipped_line = 0
+    out = open( tmp_name, 'w' )
+
+    # Write track definition line.
+    out.write( "track type=bedGraph\n")
+
+    # Write track data to temporary file.
+    i = 0
+    for i, line in enumerate( file( input_name ) ):
+        line = line.rstrip( '\r\n' )
+
+        if line and not line.startswith( '#' ):
+            try:
+                elems = line.split( '\t' )
+                start = str( int( elems[3] ) - 1 ) # GTF coordinates are 1-based, BedGraph are 0-based.
+                strand = elems[6]
+                if strand not in ['+', '-']:
+                    strand = '+'
+                attributes_list = elems[8].split(";")
+                attributes = {}
+                for name_value_pair in attributes_list:
+                    pair = name_value_pair.strip().split(" ")
+                    name = pair[0].strip()
+                    if name == '':
+                        continue
+                    # Need to strip double quotes from values.
+                    value = pair[1].strip(" \"")
+                    attributes[name] = value
+                value = attributes[ attribute_name ]
+                # GTF format: chrom, source, feature, chromStart, chromEnd, score, strand, frame, attributes.
+                # BedGraph format: chrom, chromStart, chromEnd, value
+                out.write( "%s\t%s\t%s\t%s\n" %( elems[0], start, elems[4], value ) )
+            except:
+                skipped_lines += 1
+                if not first_skipped_line:
+                    first_skipped_line = i + 1
+        else:
+            skipped_lines += 1
+            if not first_skipped_line:
+                first_skipped_line = i + 1
+    out.close()
+
+    # Sort tmp file to create bedgraph file; sort by chromosome name and chromosome start.
+    cmd = "sort -k1,1 -k2,2n < %s > %s" % ( tmp_name, output_name )
+    try:
+        os.system(cmd)
+        os.remove(tmp_name)
+    except Exception, ex:
+        sys.stderr.write( "%s\n" % ex )
+        sys.exit(1)
+
+    info_msg = "%i lines converted to BEDGraph. " % ( i + 1 - skipped_lines )
+    if skipped_lines > 0:
+        info_msg += "Skipped %d blank/comment/invalid lines starting with line #%d." %( skipped_lines, first_skipped_line )
+    print info_msg
+
+if __name__ == "__main__": __main__()
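
The closing sort (-k1,1 -k2,2n) orders the temporary file by chromosome name and then numerically by start. Because the track definition line is written to the same file and sorted along with the data, it ends up as the last line of the output, which is exactly what the expected test file gtf2bedgraph_out.bedgraph above shows. A rough pure-Python equivalent of that ordering (illustrative only, assuming tab-separated data lines)::

    def bedgraph_sort_key( line ):
        # Mirrors sort -k1,1 -k2,2n: first field as text, second field numerically.
        fields = line.split( '\t' )
        try:
            return ( fields[0], int( fields[1] ) )
        except ( IndexError, ValueError ):
            # The track line has no numeric start field; it is compared by text only.
            return ( fields[0], 0 )

    # lines.sort( key=bedgraph_sort_key )
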
details: http://www.bx.psu.edu/hg/galaxy/rev/3445ca17a4c5
changeset: 3680:3445ca17a4c5
user: rc
date: Thu Apr 22 09:35:44 2010 -0400
description:
lims:
dataset rename issue fixed
diffstat:
scripts/galaxy_messaging/server/data_transfer.py | 2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diffs (12 lines):
diff -r 155f2e89a02b -r 3445ca17a4c5 scripts/galaxy_messaging/server/data_transfer.py
--- a/scripts/galaxy_messaging/server/data_transfer.py Wed Apr 21 18:35:59 2010 -0400
+++ b/scripts/galaxy_messaging/server/data_transfer.py Thu Apr 22 09:35:44 2010 -0400
@@ -161,7 +161,7 @@
                               pexpect.TIMEOUT:print_ticks},
                              timeout=10)
         log.debug(output)
-        path = os.path.join(self.server_dir, os.path.basename(df['file']))
+        path = os.path.join(self.server_dir, os.path.basename(df['name']))
         if not os.path.exists(path):
             msg = 'Could not find the local file after transfer (%s)' % path
             log.error(msg)
details: http://www.bx.psu.edu/hg/galaxy/rev/155f2e89a02b
changeset: 3679:155f2e89a02b
user: Kanwei Li <kanwei(a)gmail.com>
date: Wed Apr 21 18:35:59 2010 -0400
description:
Fix various trackster issues
diffstat:
lib/galaxy/datatypes/converters/bam_to_summary_tree_converter.py | 2 +-
lib/galaxy/datatypes/converters/bedgraph_to_array_tree_converter.py | 2 +-
lib/galaxy/model/__init__.py | 1 -
lib/galaxy/visualization/tracks/data/interval_index.py | 3 +-
lib/galaxy/visualization/tracks/data/summary_tree.py | 10 ++-
static/scripts/galaxy.base.js | 2 +-
static/scripts/packed/galaxy.base.js | 2 +-
static/scripts/packed/trackster.js | 2 +-
static/scripts/trackster.js | 27 +++++++--
static/trackster.css | 2 +-
templates/tracks/browser.mako | 6 +-
templates/tracks/new_browser.mako | 2 +-
templates/user/dbkeys.mako | 2 +-
13 files changed, 40 insertions(+), 23 deletions(-)
diffs (236 lines):
diff -r fe14a58568ad -r 155f2e89a02b lib/galaxy/datatypes/converters/bam_to_summary_tree_converter.py
--- a/lib/galaxy/datatypes/converters/bam_to_summary_tree_converter.py Wed Apr 21 16:42:09 2010 -0400
+++ b/lib/galaxy/datatypes/converters/bam_to_summary_tree_converter.py Wed Apr 21 18:35:59 2010 -0400
@@ -19,7 +19,7 @@
     st = SummaryTree(block_size=100, levels=4, draw_cutoff=100, detail_cutoff=20)
     for read in bamfile.fetch():
-        st.insert_range(read.rname, read.mpos, read.pos + read.rlen)
+        st.insert_range(bamfile.getrname(read.rname), read.pos, read.pos + read.rlen)
     st.write(out_fname)
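
This hunk fixes two things at once: pysam's read.rname is an integer reference id rather than a chromosome name, and the range should start at the read's own position (pos), not its mate's (mpos). A minimal sketch of the corrected pattern, assuming the pysam API of this era and a hypothetical example.bam::

    import pysam

    bamfile = pysam.Samfile( "example.bam", "rb" )
    for read in bamfile.fetch():
        chrom = bamfile.getrname( read.rname )  # resolve integer id to a name like "chr1"
        start = read.pos                        # the read's own start position
        end = read.pos + read.rlen              # start plus read length
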
diff -r fe14a58568ad -r 155f2e89a02b lib/galaxy/datatypes/converters/bedgraph_to_array_tree_converter.py
--- a/lib/galaxy/datatypes/converters/bedgraph_to_array_tree_converter.py Wed Apr 21 16:42:09 2010 -0400
+++ b/lib/galaxy/datatypes/converters/bedgraph_to_array_tree_converter.py Wed Apr 21 18:35:59 2010 -0400
@@ -34,7 +34,7 @@
         chrom = feature[0]
         chrom_start = int(feature[1])
         chrom_end = int(feature[2])
-        score = int(feature[3])
+        score = float(feature[3])
         return chrom, chrom_start, chrom_end, None, score
 def main():
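
The int-to-float change matters because a bedGraph dataValue is frequently fractional (the FPKM values in the test data above, for instance), and int() rejects such strings outright::

    int( "4.8386844109" )    # raises ValueError: invalid literal for int()
    float( "4.8386844109" )  # 4.8386844109
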
diff -r fe14a58568ad -r 155f2e89a02b lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Wed Apr 21 16:42:09 2010 -0400
+++ b/lib/galaxy/model/__init__.py Wed Apr 21 18:35:59 2010 -0400
@@ -611,7 +611,6 @@
             if fail_dependencies:
                 return None
         except ValueError:
-            log.debug("WTF")
             raise ValueError("A dependency could not be converted.")
         except KeyError:
             pass # No deps
diff -r fe14a58568ad -r 155f2e89a02b lib/galaxy/visualization/tracks/data/interval_index.py
--- a/lib/galaxy/visualization/tracks/data/interval_index.py Wed Apr 21 16:42:09 2010 -0400
+++ b/lib/galaxy/visualization/tracks/data/interval_index.py Wed Apr 21 18:35:59 2010 -0400
@@ -26,7 +26,8 @@
         payload = [ offset, start, end ]
         if "no_detail" not in kwargs:
             length = len(feature)
-            payload.append(feature[3]) # name
+            if length >= 4:
+                payload.append(feature[3]) # name
             if length >= 6: # strand
                 payload.append(feature[5])
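
The new guard reflects the BED column rules: only chrom, start, and end are mandatory, while name (column 4) and strand (column 6) are optional, so indexing feature[3] on a three-column record raised IndexError. The same pattern in isolation, with a hypothetical minimal record::

    feature = [ "chr1", "100", "200" ]  # a three-column BED record
    payload = []
    if len( feature ) >= 4:
        payload.append( feature[3] )    # name, when present
    if len( feature ) >= 6:
        payload.append( feature[5] )    # strand, when present
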
diff -r fe14a58568ad -r 155f2e89a02b lib/galaxy/visualization/tracks/data/summary_tree.py
--- a/lib/galaxy/visualization/tracks/data/summary_tree.py Wed Apr 21 16:42:09 2010 -0400
+++ b/lib/galaxy/visualization/tracks/data/summary_tree.py Wed Apr 21 18:35:59 2010 -0400
@@ -19,17 +19,23 @@
         if st is None:
             st = summary_tree_from_file( self.dataset.file_name )
             CACHE[filename] = st
-        if chrom not in st.chrom_blocks:
+        if chrom in st.chrom_blocks:
+            pass
+        elif chrom[3:] in st.chrom_blocks:
+            chrom = chrom[3:]
+        else:
             return None
         resolution = max(1, ceil(float(kwargs['resolution'])))
         level = ceil( log( resolution, st.block_size ) )
         level = int(max( level, 0 ))
+        if level <= 0:
+            return None
         stats = st.chrom_stats[chrom]
         results = st.query(chrom, int(start), int(end), level)
-        if results == "detail" or level <= 0:
+        if results == "detail":
             return None
         elif results == "draw":
             return "no_detail", None, None
diff -r fe14a58568ad -r 155f2e89a02b static/scripts/galaxy.base.js
--- a/static/scripts/galaxy.base.js Wed Apr 21 16:42:09 2010 -0400
+++ b/static/scripts/galaxy.base.js Wed Apr 21 18:35:59 2010 -0400
@@ -341,7 +341,7 @@
         var id = this.id;
         var body = $(this).children( "div.historyItemBody" );
         var peek = body.find( "pre.peek" )
-        $(this).find( ".historyItemTitleBar > .historyItemTitle" ).wrap( "<a href='javascript:void();'></a>" ).click( function() {
+        $(this).find( ".historyItemTitleBar > .historyItemTitle" ).wrap( "<a href='javascript:void(0);'></a>" ).click( function() {
             if ( body.is(":visible") ) {
                 // Hiding stuff here
                 if ( $.browser.mozilla ) { peek.css( "overflow", "hidden" ); }
diff -r fe14a58568ad -r 155f2e89a02b static/scripts/packed/galaxy.base.js
--- a/static/scripts/packed/galaxy.base.js Wed Apr 21 16:42:09 2010 -0400
+++ b/static/scripts/packed/galaxy.base.js Wed Apr 21 18:35:59 2010 -0400
@@ -1,1 +1,1 @@
-$(document).ready(function(){replace_big_select_inputs()});$.fn.makeAbsolute=function(a){return this.each(function(){var b=$(this);var c=b.position();b.css({position:"absolute",marginLeft:0,marginTop:0,top:c.top,left:c.left,right:$(window).width()-(c.left+b.width())});if(a){b.remove().appendTo("body")}})};function ensure_popup_helper(){if($("#popup-helper").length===0){$("<div id='popup-helper'/>").css({background:"white",opacity:0,zIndex:15000,position:"absolute",top:0,left:0,width:"100%",height:"100%"}).appendTo("body").hide()}}function attach_popupmenu(b,d){var a=function(){d.unbind().hide();$("#popup-helper").unbind("click.popupmenu").hide()};var c=function(g){$("#popup-helper").bind("click.popupmenu",a).show();d.click(a).css({left:0,top:-1000}).show();var f=g.pageX-d.width()/2;f=Math.min(f,$(document).scrollLeft()+$(window).width()-$(d).width()-20);f=Math.max(f,$(document).scrollLeft()+20);d.css({top:g.pageY-5,left:f});return false};$(b).click(c)}function make_popupmen!
u(c,b){ensure_popup_helper();var a=$("<ul id='"+c.attr("id")+"-menu'></ul>");$.each(b,function(f,e){if(e){$("<li/>").html(f).click(e).appendTo(a)}else{$("<li class='head'/>").html(f).appendTo(a)}});var d=$("<div class='popmenu-wrapper'>");d.append(a).append("<div class='overlay-border'>").css("position","absolute").appendTo("body").hide();attach_popupmenu(c,d)}function make_popup_menus(){jQuery("div[popupmenu]").each(function(){var c={};$(this).find("a").each(function(){var b=$(this).attr("confirm"),d=$(this).attr("href"),e=$(this).attr("target");c[$(this).text()]=function(){if(!b||confirm(b)){var g=window;if(e=="_parent"){g=window.parent}else{if(e=="_top"){g=window.top}}g.location=d}}});var a=$("#"+$(this).attr("popupmenu"));make_popupmenu(a,c);$(this).remove();a.addClass("popup").show()})}function array_length(b){if(b.length){return b.length}var c=0;for(var a in b){c++}return c}function naturalSort(i,g){var n=/(-?[0-9\.]+)/g,j=i.toString().toLowerCase()||"",f=g.toString()!
.toLowerCase()||"",k=String.fromCharCode(0),l=j.replace(n,k+"$1"+k).sp
lit(k),e=f.replace(n,k+"$1"+k).split(k),d=(new Date(j)).getTime(),m=d?(new Date(f)).getTime():null;if(m){if(d<m){return -1}else{if(d>m){return 1}}}for(var h=0,c=Math.max(l.length,e.length);h<c;h++){oFxNcL=parseFloat(l[h])||l[h];oFyNcL=parseFloat(e[h])||e[h];if(oFxNcL<oFyNcL){return -1}else{if(oFxNcL>oFyNcL){return 1}}}return 0}function replace_big_select_inputs(a){$("select[name=dbkey]").each(function(){var b=$(this);if(a!==undefined&&b.find("option").length<a){return}var c=b.attr("value");var d=$("<input type='text' class='text-and-autocomplete-select'></input>");d.attr("size",40);d.attr("name",b.attr("name"));d.attr("id",b.attr("id"));d.click(function(){var i=$(this).attr("value");$(this).attr("value","Loading...");$(this).showAllInCache();$(this).attr("value",i);$(this).select()});var h=[];var g={};b.children("option").each(function(){var j=$(this).text();var i=$(this).attr("value");if(i=="?"){return}h.push(j);g[j]=i;g[i]=i;if(i==c){d.attr("value",j)}});h.push("unspecifie!
d (?)");g["unspecified (?)"]="?";g["?"]="?";if(d.attr("value")==""){d.attr("value","Click to Search or Select")}h=h.sort(naturalSort);var f={selectFirst:false,autoFill:false,mustMatch:false,matchContains:true,max:1000,minChars:0,hideForLessThanMinChars:false};d.autocomplete(h,f);b.replaceWith(d);var e=function(){var j=d.attr("value");var i=g[j];if(i!==null&&i!==undefined){d.attr("value",i)}else{if(c!=""){d.attr("value",c)}else{d.attr("value","?")}}};d.parents("form").submit(function(){e()});$(document).bind("convert_dbkeys",function(){e()})})}function async_save_text(d,f,e,a,c,h,i,g,b){if(c===undefined){c=30}if(i===undefined){i=4}$("#"+d).live("click",function(){if($("#renaming-active").length>0){return}var l=$("#"+f),k=l.text(),j;if(h){j=$("<textarea></textarea>").attr({rows:i,cols:c}).text(k)}else{j=$("<input type='text'></input>").attr({value:k,size:c})}j.attr("id","renaming-active");j.blur(function(){$(this).remove();l.show();if(b){b(j)}});j.keyup(function(n){if(n.keyCo!
de===27){$(this).trigger("blur")}else{if(n.keyCode===13){var m={};m[a]
=$(this).val();$(this).trigger("blur");$.ajax({url:e,data:m,error:function(){alert("Text editing for elt "+f+" failed")},success:function(o){l.text(o);if(b){b(j)}}})}}});if(g){g(j)}l.hide();j.insertAfter(l);j.focus();j.select();return})}function init_history_items(d,a,c){var b=function(){try{var e=$.jStore.store("history_expand_state");if(e){for(var g in e){$("#"+g+" div.historyItemBody").show()}}}catch(f){$.jStore.remove("history_expand_state")}if($.browser.mozilla){$("div.historyItemBody").each(function(){if(!$(this).is(":visible")){$(this).find("pre.peek").css("overflow","hidden")}})}d.each(function(){var j=this.id;var h=$(this).children("div.historyItemBody");var i=h.find("pre.peek");$(this).find(".historyItemTitleBar > .historyItemTitle").wrap("<a href='javascript:void();'></a>").click(function(){if(h.is(":visible")){if($.browser.mozilla){i.css("overflow","hidden")}h.slideUp("fast");if(!c){var k=$.jStore.store("history_expand_state");if(k){delete k[j];$.jStore.store("hi!
story_expand_state",k)}}}else{h.slideDown("fast",function(){if($.browser.mozilla){i.css("overflow","auto")}});if(!c){var k=$.jStore.store("history_expand_state");if(k===undefined){k={}}k[j]=true;$.jStore.store("history_expand_state",k)}}return false})});$("#top-links > a.toggle").click(function(){var h=$.jStore.store("history_expand_state");if(h===undefined){h={}}$("div.historyItemBody:visible").each(function(){if($.browser.mozilla){$(this).find("pre.peek").css("overflow","hidden")}$(this).slideUp("fast");if(h){delete h[$(this).parent().attr("id")]}});$.jStore.store("history_expand_state",h)}).show()};if(a){b()}else{$.jStore.init("galaxy");$.jStore.engineReady(function(){b()})}}$(document).ready(function(){$("a[confirm]").click(function(){return confirm($(this).attr("confirm"))});if($.fn.tipsy){$(".tooltip").tipsy({gravity:"s"})}make_popup_menus()});
\ No newline at end of file
+$(document).ready(function(){replace_big_select_inputs()});$.fn.makeAbsolute=function(a){return this.each(function(){var b=$(this);var c=b.position();b.css({position:"absolute",marginLeft:0,marginTop:0,top:c.top,left:c.left,right:$(window).width()-(c.left+b.width())});if(a){b.remove().appendTo("body")}})};function ensure_popup_helper(){if($("#popup-helper").length===0){$("<div id='popup-helper'/>").css({background:"white",opacity:0,zIndex:15000,position:"absolute",top:0,left:0,width:"100%",height:"100%"}).appendTo("body").hide()}}function attach_popupmenu(b,d){var a=function(){d.unbind().hide();$("#popup-helper").unbind("click.popupmenu").hide()};var c=function(g){$("#popup-helper").bind("click.popupmenu",a).show();d.click(a).css({left:0,top:-1000}).show();var f=g.pageX-d.width()/2;f=Math.min(f,$(document).scrollLeft()+$(window).width()-$(d).width()-20);f=Math.max(f,$(document).scrollLeft()+20);d.css({top:g.pageY-5,left:f});return false};$(b).click(c)}function make_popupmen!
u(c,b){ensure_popup_helper();var a=$("<ul id='"+c.attr("id")+"-menu'></ul>");$.each(b,function(f,e){if(e){$("<li/>").html(f).click(e).appendTo(a)}else{$("<li class='head'/>").html(f).appendTo(a)}});var d=$("<div class='popmenu-wrapper'>");d.append(a).append("<div class='overlay-border'>").css("position","absolute").appendTo("body").hide();attach_popupmenu(c,d)}function make_popup_menus(){jQuery("div[popupmenu]").each(function(){var c={};$(this).find("a").each(function(){var b=$(this).attr("confirm"),d=$(this).attr("href"),e=$(this).attr("target");c[$(this).text()]=function(){if(!b||confirm(b)){var g=window;if(e=="_parent"){g=window.parent}else{if(e=="_top"){g=window.top}}g.location=d}}});var a=$("#"+$(this).attr("popupmenu"));make_popupmenu(a,c);$(this).remove();a.addClass("popup").show()})}function array_length(b){if(b.length){return b.length}var c=0;for(var a in b){c++}return c}function naturalSort(i,g){var n=/(-?[0-9\.]+)/g,j=i.toString().toLowerCase()||"",f=g.toString()!
.toLowerCase()||"",k=String.fromCharCode(0),l=j.replace(n,k+"$1"+k).sp
lit(k),e=f.replace(n,k+"$1"+k).split(k),d=(new Date(j)).getTime(),m=d?(new Date(f)).getTime():null;if(m){if(d<m){return -1}else{if(d>m){return 1}}}for(var h=0,c=Math.max(l.length,e.length);h<c;h++){oFxNcL=parseFloat(l[h])||l[h];oFyNcL=parseFloat(e[h])||e[h];if(oFxNcL<oFyNcL){return -1}else{if(oFxNcL>oFyNcL){return 1}}}return 0}function replace_big_select_inputs(a){$("select[name=dbkey]").each(function(){var b=$(this);if(a!==undefined&&b.find("option").length<a){return}var c=b.attr("value");var d=$("<input type='text' class='text-and-autocomplete-select'></input>");d.attr("size",40);d.attr("name",b.attr("name"));d.attr("id",b.attr("id"));d.click(function(){var i=$(this).attr("value");$(this).attr("value","Loading...");$(this).showAllInCache();$(this).attr("value",i);$(this).select()});var h=[];var g={};b.children("option").each(function(){var j=$(this).text();var i=$(this).attr("value");if(i=="?"){return}h.push(j);g[j]=i;g[i]=i;if(i==c){d.attr("value",j)}});h.push("unspecifie!
d (?)");g["unspecified (?)"]="?";g["?"]="?";if(d.attr("value")==""){d.attr("value","Click to Search or Select")}h=h.sort(naturalSort);var f={selectFirst:false,autoFill:false,mustMatch:false,matchContains:true,max:1000,minChars:0,hideForLessThanMinChars:false};d.autocomplete(h,f);b.replaceWith(d);var e=function(){var j=d.attr("value");var i=g[j];if(i!==null&&i!==undefined){d.attr("value",i)}else{if(c!=""){d.attr("value",c)}else{d.attr("value","?")}}};d.parents("form").submit(function(){e()});$(document).bind("convert_dbkeys",function(){e()})})}function async_save_text(d,f,e,a,c,h,i,g,b){if(c===undefined){c=30}if(i===undefined){i=4}$("#"+d).live("click",function(){if($("#renaming-active").length>0){return}var l=$("#"+f),k=l.text(),j;if(h){j=$("<textarea></textarea>").attr({rows:i,cols:c}).text(k)}else{j=$("<input type='text'></input>").attr({value:k,size:c})}j.attr("id","renaming-active");j.blur(function(){$(this).remove();l.show();if(b){b(j)}});j.keyup(function(n){if(n.keyCo!
de===27){$(this).trigger("blur")}else{if(n.keyCode===13){var m={};m[a]
=$(this).val();$(this).trigger("blur");$.ajax({url:e,data:m,error:function(){alert("Text editing for elt "+f+" failed")},success:function(o){l.text(o);if(b){b(j)}}})}}});if(g){g(j)}l.hide();j.insertAfter(l);j.focus();j.select();return})}function init_history_items(d,a,c){var b=function(){try{var e=$.jStore.store("history_expand_state");if(e){for(var g in e){$("#"+g+" div.historyItemBody").show()}}}catch(f){$.jStore.remove("history_expand_state")}if($.browser.mozilla){$("div.historyItemBody").each(function(){if(!$(this).is(":visible")){$(this).find("pre.peek").css("overflow","hidden")}})}d.each(function(){var j=this.id;var h=$(this).children("div.historyItemBody");var i=h.find("pre.peek");$(this).find(".historyItemTitleBar > .historyItemTitle").wrap("<a href='javascript:void(0);'></a>").click(function(){if(h.is(":visible")){if($.browser.mozilla){i.css("overflow","hidden")}h.slideUp("fast");if(!c){var k=$.jStore.store("history_expand_state");if(k){delete k[j];$.jStore.store("h!
istory_expand_state",k)}}}else{h.slideDown("fast",function(){if($.browser.mozilla){i.css("overflow","auto")}});if(!c){var k=$.jStore.store("history_expand_state");if(k===undefined){k={}}k[j]=true;$.jStore.store("history_expand_state",k)}}return false})});$("#top-links > a.toggle").click(function(){var h=$.jStore.store("history_expand_state");if(h===undefined){h={}}$("div.historyItemBody:visible").each(function(){if($.browser.mozilla){$(this).find("pre.peek").css("overflow","hidden")}$(this).slideUp("fast");if(h){delete h[$(this).parent().attr("id")]}});$.jStore.store("history_expand_state",h)}).show()};if(a){b()}else{$.jStore.init("galaxy");$.jStore.engineReady(function(){b()})}}$(document).ready(function(){$("a[confirm]").click(function(){return confirm($(this).attr("confirm"))});if($.fn.tipsy){$(".tooltip").tipsy({gravity:"s"})}make_popup_menus()});
\ No newline at end of file
diff -r fe14a58568ad -r 155f2e89a02b static/scripts/packed/trackster.js
--- a/static/scripts/packed/trackster.js Wed Apr 21 16:42:09 2010 -0400
+++ b/static/scripts/packed/trackster.js Wed Apr 21 18:35:59 2010 -0400
@@ -1,1 +1,1 @@
-var DEBUG=false;var DENSITY=1000,FEATURE_LEVELS=10,DATA_ERROR="There was an error in indexing this dataset.",DATA_NOCONVERTER="A converter for this dataset is not installed. Please check your datatypes_conf.xml file.",DATA_NONE="No data for this chrom/contig.",DATA_PENDING="Currently indexing... please wait",DATA_LOADING="Loading data...",CACHED_TILES_FEATURE=10,CACHED_TILES_LINE=30,CACHED_DATA=20,CONTEXT=$("<canvas></canvas>").get(0).getContext("2d"),RIGHT_STRAND,LEFT_STRAND;var right_img=new Image();right_img.src="../images/visualization/strand_right.png";right_img.onload=function(){RIGHT_STRAND=CONTEXT.createPattern(right_img,"repeat")};var left_img=new Image();left_img.src="../images/visualization/strand_left.png";left_img.onload=function(){LEFT_STRAND=CONTEXT.createPattern(left_img,"repeat")};var right_img_inv=new Image();right_img_inv.src="../images/visualization/strand_right_inv.png";right_img_inv.onload=function(){RIGHT_STRAND_INV=CONTEXT.createPattern(right_img_inv!
,"repeat")};var left_img_inv=new Image();left_img_inv.src="../images/visualization/strand_left_inv.png";left_img_inv.onload=function(){LEFT_STRAND_INV=CONTEXT.createPattern(left_img_inv,"repeat")};function commatize(b){b+="";var a=/(\d+)(\d{3})/;while(a.test(b)){b=b.replace(a,"$1,$2")}return b}var Cache=function(a){this.num_elements=a;this.clear()};$.extend(Cache.prototype,{get:function(b){var a=this.key_ary.indexOf(b);if(a!=-1){this.key_ary.splice(a,1);this.key_ary.push(b)}return this.obj_cache[b]},set:function(b,c){if(!this.obj_cache[b]){if(this.key_ary.length>=this.num_elements){var a=this.key_ary.shift();delete this.obj_cache[a]}this.key_ary.push(b)}this.obj_cache[b]=c;return c},clear:function(){this.obj_cache={};this.key_ary=[]}});var Drawer=function(){};$.extend(Drawer.prototype,{intensity:function(b,a,c){},});drawer=new Drawer();var View=function(b,d,c,a){this.vis_id=c;this.dbkey=a;this.title=d;this.chrom=b;this.tracks=[];this.label_tracks=[];this.max_low=0;this.max_!
high=0;this.center=(this.max_high-this.max_low)/2;this.zoom_factor=3;t
his.zoom_level=0;this.track_id_counter=0};$.extend(View.prototype,{add_track:function(a){a.view=this;a.track_id=this.track_id_counter;this.tracks.push(a);if(a.init){a.init()}a.container_div.attr("id","track_"+a.track_id);this.track_id_counter+=1},add_label_track:function(a){a.view=this;this.label_tracks.push(a)},remove_track:function(a){a.container_div.fadeOut("slow",function(){$(this).remove()});delete this.tracks[a]},update_options:function(){var b=$("ul#sortable-ul").sortable("toArray");var d=[];var c=$("#viewport > div").sort(function(g,f){return b.indexOf($(g).attr("id"))>b.indexOf($(f).attr("id"))});$("#viewport > div").remove();$("#viewport").html(c);for(var e in view.tracks){var a=view.tracks[e];if(a.update_options){a.update_options(e)}}},reset:function(){this.low=this.max_low;this.high=this.max_high;this.center=this.center=(this.max_high-this.max_low)/2;this.zoom_level=0;$(".yaxislabel").remove()},redraw:function(f){this.span=this.max_high-this.max_low;var d=this.sp!
an/Math.pow(this.zoom_factor,this.zoom_level),b=this.center-(d/2),e=b+d;if(b<0){b=0;e=b+d}else{if(e>this.max_high){e=this.max_high;b=e-d}}this.low=Math.floor(b);this.high=Math.ceil(e);this.center=Math.round(this.low+(this.high-this.low)/2);this.resolution=Math.pow(10,Math.ceil(Math.log((this.high-this.low)/200)/Math.LN10));this.zoom_res=Math.pow(FEATURE_LEVELS,Math.max(0,Math.ceil(Math.log(this.resolution,FEATURE_LEVELS)/Math.log(FEATURE_LEVELS))));$("#overview-box").css({left:(this.low/this.span)*$("#overview-viewport").width(),width:Math.max(12,((this.high-this.low)/this.span)*$("#overview-viewport").width())}).show();$("#low").val(commatize(this.low));$("#high").val(commatize(this.high));if(!f){for(var c=0,a=this.tracks.length;c<a;c++){if(this.tracks[c].enabled){this.tracks[c].draw()}}for(var c=0,a=this.label_tracks.length;c<a;c++){this.label_tracks[c].draw()}}},zoom_in:function(a,b){if(this.max_high===0||this.high-this.low<30){return}if(a){this.center=a/b.width()*(this.!
high-this.low)+this.low}this.zoom_level+=1;this.redraw()},zoom_out:fun
ction(){if(this.max_high===0){return}if(this.zoom_level<=0){this.zoom_level=0;return}this.zoom_level-=1;this.redraw()}});var Track=function(a,b){this.name=a;this.parent_element=b;this.init_global()};$.extend(Track.prototype,{init_global:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div></div>").addClass("track").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)},init_each:function(c,b){var a=this;a.enabled=false;a.data_queue={};a.tile_cache.clear();a.data_cache.clear();a.content_div.css("height","30px");a.content_div.text(DATA_LOADING);a.container_div.removeClass("nodata error pending");if(a.view.chrom){$.getJSON(data_url,c,function(d){if(!d||d==="error"){a.container_div.addClass("error");a.content_div.text(DATA_ERROR)}else{if(d==="no converter"){a.container_div.addClass("error");a.content_div.text(DATA_NOCONVERTER)}else{if(d.dat!
a&&d.data.length===0||d.data===null){a.container_div.addClass("nodata");a.content_div.text(DATA_NONE)}else{if(d==="pending"){a.container_div.addClass("pending");a.content_div.text(DATA_PENDING);setTimeout(function(){a.init()},5000)}else{a.content_div.text("");a.content_div.css("height",a.height_px+"px");a.enabled=true;b(d);a.draw()}}}}})}else{a.container_div.addClass("nodata");a.content_div.text(DATA_NONE)}}});var TiledTrack=function(){};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,e=this.view.high,f=e-i,d=this.view.resolution;if(DEBUG){$("#debug").text(d+" "+this.view.zoom_res)}var k=$("<div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(k);var l=this.content_div.width()/f;var h;var a=Math.floor(i/d/DENSITY);while((a*DENSITY*d)<e){var j=this.content_div.width()+"_"+this.view.zoom_level+"_"+a;var c=this.tile_cache.get(j);if(c){var g=a*DENSITY*d;var b=(g-i)*l;if(this.left_offset)!
{b-=this.left_offset}c.css({left:b});k.append(c);this.max_height=Math.
max(this.max_height,c.height())}else{this.delayed_draw(this,j,i,e,a,d,k,l)}a+=1}},delayed_draw:function(c,e,a,f,b,d,g,h){setTimeout(function(){if(!(a>c.view.high||f<c.view.low)){tile_element=c.draw_tile(d,b,g,h);if(tile_element){c.tile_cache.set(e,tile_element);c.max_height=Math.max(c.max_height,tile_element.height());c.content_div.css("height",c.max_height+"px")}}},50)}});var LabelTrack=function(a){Track.call(this,null,a);this.track_type="LabelTrack";this.hidden=true;this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style='position: relative; height: 1.3em;'></div>");while(a<c.high){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+commatize(a)+"</div>").css({position:"absolute",left:f-1}));a+=g}this.content_div.children(":first").remove();this.content_div.appen!
d(b)}});var LineTrack=function(c,a,b){this.track_type="LineTrack";Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.height_px=100;this.container_div.addClass("line-track");this.dataset_id=a;this.data_cache=new Cache(CACHED_DATA);this.tile_cache=new Cache(CACHED_TILES_LINE);this.prefs={min_value:undefined,max_value:undefined,mode:"Line"};if(b.min_value!==undefined){this.prefs.min_value=b.min_value}if(b.max_value!==undefined){this.prefs.max_value=b.max_value}if(b.mode!==undefined){this.prefs.mode=b.mode}};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this,b=a.view.tracks.indexOf(a);a.vertical_range=undefined;this.init_each({stats:true,chrom:a.view.chrom,low:null,high:null,dataset_id:a.dataset_id},function(c){data=c.data;if(isNaN(parseFloat(a.prefs.min_value))||isNaN(parseFloat(a.prefs.max_value))){a.prefs.min_value=data.min;a.prefs.max_value=data.max;$("#track_"+b+"_minval").val(a.prefs.min_value);$("#track_"+b+"_maxval").val(a.prefs.max_!
value)}a.vertical_range=a.prefs.max_value-a.prefs.min_value;a.total_fr
equency=data.total_frequency;$("#linetrack_"+b+"_minval").remove();$("#linetrack_"+b+"_maxval").remove();var e=$("<div></div>").addClass("yaxislabel").attr("id","linetrack_"+b+"_minval").text(a.prefs.min_value);var d=$("<div></div>").addClass("yaxislabel").attr("id","linetrack_"+b+"_maxval").text(a.prefs.max_value);d.css({position:"relative",top:"25px"});d.prependTo(a.container_div);e.css({position:"relative",top:a.height_px+55+"px"});e.prependTo(a.container_div)})},get_data:function(d,b){var c=this,a=b*DENSITY*d,f=(b+1)*DENSITY*d,e=d+"_"+b;if(!c.data_queue[e]){c.data_queue[e]=true;$.ajax({url:data_url,dataType:"json",data:{chrom:this.view.chrom,low:a,high:f,dataset_id:this.dataset_id,resolution:this.view.resolution},success:function(g){data=g.data;c.data_cache.set(e,data);delete c.data_queue[e];c.draw()},error:function(h,g,i){console.log(h,g,i)}})}},draw_tile:function(o,q,c,e){if(this.vertical_range===undefined){return}var r=q*DENSITY*o,a=DENSITY*o,b=$("<canvas class='tile'!
></canvas>"),u=o+"_"+q;if(this.data_cache.get(u)===undefined){this.get_data(o,q);return}var t=this.data_cache.get(u);if(t===null){return}b.css({position:"absolute",top:0,left:(r-this.view.low)*e});b.get(0).width=Math.ceil(a*e);b.get(0).height=this.height_px;var n=b.get(0).getContext("2d"),k=false,l=this.prefs.min_value,g=this.prefs.max_value,m=this.vertical_range,s=this.total_frequency,d=this.height_px;n.beginPath();if(t.length>1){var f=Math.ceil((t[1][0]-t[0][0])*e)}else{var f=10}for(var p=0;p<t.length;p++){var j=t[p][0]-r;var h=t[p][1];if(this.prefs.mode=="Intensity"){if(h===null){continue}j=j*e;if(h<=l){h=l}else{if(h>=g){h=g}}h=255-Math.floor((h-l)/m*255);n.fillStyle="rgb("+h+","+h+","+h+")";n.fillRect(j,0,f,this.height_px)}else{if(h===null){k=false;continue}else{j=j*e;if(h<=l){h=l}else{if(h>=g){h=g}}h=Math.round(d-(h-l)/m*d);if(k){n.lineTo(j,h)}else{n.moveTo(j,h);k=true}}}}n.stroke();c.append(b);return b},gen_options:function(n){var a=$("<div></div>").addClass("form-row!
");var h="track_"+n+"_minval",k="track_"+n+"_maxval",e="track_"+n+"_mo
de",l=$("<label></label>").attr("for",h).text("Min value:"),b=(this.prefs.min_value===undefined?"":this.prefs.min_value),m=$("<input></input>").attr("id",h).val(b),g=$("<label></label>").attr("for",k).text("Max value:"),j=(this.prefs.max_value===undefined?"":this.prefs.max_value),f=$("<input></input>").attr("id",k).val(j),d=$("<label></label>").attr("for",e).text("Display mode:"),i=(this.prefs.mode===undefined?"Line":this.prefs.mode),c=$('<select id="'+e+'"><option value="Line" id="mode_Line">Line</option><option value="Intensity" id="mode_Intensity">Intensity</option></select>');c.children("#mode_"+i).attr("selected","selected");return a.append(l).append(m).append(g).append(f).append(d).append(c)},update_options:function(d){var a=$("#track_"+d+"_minval").val(),c=$("#track_"+d+"_maxval").val(),b=$("#track_"+d+"_mode option:selected").val();if(a!==this.prefs.min_value||c!==this.prefs.max_value||b!=this.prefs.mode){this.prefs.min_value=parseFloat(a);this.prefs.max_value=parseF!
loat(c);this.prefs.mode=b;this.vertical_range=this.prefs.max_value-this.prefs.min_value;$("#linetrack_"+d+"_minval").text(this.prefs.min_value);$("#linetrack_"+d+"_maxval").text(this.prefs.max_value);this.tile_cache.clear();this.draw()}}});var FeatureTrack=function(c,a,b){this.track_type="FeatureTrack";Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.height_px=100;this.container_div.addClass("feature-track");this.dataset_id=a;this.zo_slots={};this.show_labels_scale=0.001;this.showing_details=false;this.vertical_detail_px=10;this.vertical_nodetail_px=3;this.default_font="9px Monaco, Lucida Console, monospace";this.left_offset=200;this.inc_slots={};this.data_queue={};this.s_e_by_tile={};this.tile_cache=new Cache(CACHED_TILES_FEATURE);this.data_cache=new Cache(20);this.prefs={block_color:"black",label_color:"black",show_counts:false};if(b.block_color!==undefined){this.prefs.block_color=b.block_color}if(b.label_color!==undefined){this.prefs.label_color=b.label_color!
}if(b.show_counts!==undefined){this.prefs.show_counts=b.show_counts}};
$.extend(FeatureTrack.prototype,TiledTrack.prototype,{init:function(){var a=this,b=a.view.max_low+"_"+a.view.max_high;this.init_each({low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom,resolution:this.view.resolution},function(c){a.data_cache.set(b,c);a.draw()})},get_data:function(a,d){var b=this,c=a+"_"+d;if(!b.data_queue[c]){b.data_queue[c]=true;$.getJSON(data_url,{chrom:b.view.chrom,low:a,high:d,dataset_id:b.dataset_id,resolution:this.view.resolution},function(e){b.data_cache.set(c,e);delete b.data_queue[c];b.draw()})}},incremental_slots:function(a,g,c){if(!this.inc_slots[a]){this.inc_slots[a]={};this.inc_slots[a].w_scale=1/a;this.s_e_by_tile[a]={}}var m=this.inc_slots[a].w_scale,u=[],h=0,b=$("<canvas></canvas>").get(0).getContext("2d"),n=this.view.max_low;var d,f,w=[];for(var r=0,s=g.length;r<s;r++){var e=g[r],l=e[0];if(this.inc_slots[a][l]!==undefined){h=Math.max(h,this.inc_slots[a][l]);w.push(this.inc_slots[a][l])}else{u.push(r)}}for(var!
r=0,s=u.length;r<s;r++){var e=g[u[r]];l=e[0],feature_start=e[1],feature_end=e[2],feature_name=e[3];d=Math.floor((feature_start-n)*m);if(!c){d-=b.measureText(feature_name).width}f=Math.ceil((feature_end-n)*m);var q=0;while(true){var o=true;if(this.s_e_by_tile[a][q]!==undefined){for(var p=0,v=this.s_e_by_tile[a][q].length;p<v;p++){var t=this.s_e_by_tile[a][q][p];if(f>t[0]&&d<t[1]){o=false;break}}}if(o){if(this.s_e_by_tile[a][q]===undefined){this.s_e_by_tile[a][q]=[]}this.s_e_by_tile[a][q].push([d,f]);this.inc_slots[a][l]=q;h=Math.max(h,q);break}q++}}return h},draw_tile:function(R,h,m,ae){var z=h*DENSITY*R,X=(h+1)*DENSITY*R,w=DENSITY*R;var ac,ad,p;var Y=z+"_"+X;var ac=this.data_cache.get(Y);if(ac===undefined){this.data_queue[[z,X]]=true;this.get_data(z,X);return}if(ac.dataset_type=="array_tree"){p=30}else{var P=(ac.extra_info==="no_detail");var af=(P?this.vertical_nodetail_px:this.vertical_detail_px);p=this.incremental_slots(this.view.zoom_res,ac.data,P)*af+15;m.parent().css(!
"height",Math.max(this.height_px,p)+"px");ad=this.inc_slots[this.view.
zoom_res]}var a=Math.ceil(w*ae),F=$("<canvas class='tile'></canvas>"),T=this.prefs.label_color,f=this.prefs.block_color,J=this.left_offset;F.css({position:"absolute",top:0,left:(z-this.view.low)*ae-J});F.get(0).width=a+J;F.get(0).height=p;var t=F.get(0).getContext("2d");t.fillStyle=this.prefs.block_color;t.font=this.default_font;t.textAlign="right";var C=55,W=255-C,g=W*2/3;if(ac.dataset_type=="summary_tree"){var L=ac.data;var v=ac.max;var l=ac.avg;if(ac.data.length>2){var b=Math.ceil((L[1][0]-L[0][0])*ae)}else{var b=50}for(var aa=0,s=L.length;aa<s;aa++){var N=Math.ceil((L[aa][0]-z)*ae);var M=L[aa][1];if(!M){continue}var E=Math.floor(W-(M/v)*W);t.fillStyle="rgb("+E+","+E+","+E+")";t.fillRect(N+J,0,b,20);if(this.prefs.show_counts){if(E>g){t.fillStyle="black"}else{t.fillStyle="#ddd"}t.textAlign="center";t.fillText(L[aa][1],N+J+(b/2),12)}}m.append(F);return F}var ac=ac.data;var Z=0;for(var aa=0,s=ac.length;aa<s;aa++){var G=ac[aa],D=G[0],ab=G[1],O=G[2],A=G[3];if(ab<=X&&O>=z){var !
Q=Math.floor(Math.max(0,(ab-z)*ae)),u=Math.ceil(Math.min(a,(O-z)*ae)),K=ad[D]*af;if(P){t.fillRect(Q+J,K+5,u-Q,1)}else{var r=G[4],I=G[5],S=G[6],e=G[7];var q,U,B=null,ag=null;if(I&&S){B=Math.floor(Math.max(0,(I-z)*ae));ag=Math.ceil(Math.min(a,(S-z)*ae))}if(ab>z){t.fillStyle=T;t.fillText(A,Q-1+J,K+8);t.fillStyle=f}if(e){if(r){if(r=="+"){t.fillStyle=RIGHT_STRAND}else{if(r=="-"){t.fillStyle=LEFT_STRAND}}t.fillRect(Q+J,K,u-Q,10);t.fillStyle=f}for(var Y=0,d=e.length;Y<d;Y++){var n=e[Y],c=Math.floor(Math.max(0,(n[0]-z)*ae)),H=Math.ceil(Math.min(a,(n[1]-z)*ae));if(c>H){continue}q=5;U=3;t.fillRect(c+J,K+U,H-c,q);if(B!==undefined&&!(c>ag||H<B)){q=9;U=1;var V=Math.max(c,B),o=Math.min(H,ag);t.fillRect(V+J,K+U,o-V,q)}}}else{q=9;U=1;t.fillRect(Q+J,K+U,u-Q,q);if(G.strand){if(G.strand=="+"){t.fillStyle=RIGHT_STRAND_INV}else{if(G.strand=="-"){t.fillStyle=LEFT_STRAND_INV}}t.fillRect(Q+J,K,u-Q,10);t.fillStyle=prefs.block_color}}}Z++}}m.append(F);return F},gen_options:function(i){var a=$("<div>!
</div>").addClass("form-row");var e="track_"+i+"_block_color",k=$("<la
bel></label>").attr("for",e).text("Block color:"),l=$("<input></input>").attr("id",e).attr("name",e).val(this.prefs.block_color),j="track_"+i+"_label_color",g=$("<label></label>").attr("for",j).text("Text color:"),h=$("<input></input>").attr("id",j).attr("name",j).val(this.prefs.label_color),f="track_"+i+"_show_count",c=$("<label></label>").attr("for",f).text("Show summary counts"),b=$('<input type="checkbox" style="float:left;"></input>').attr("id",f).attr("name",f).attr("checked",this.prefs.show_counts),d=$("<div></div>").append(b).append(c);return a.append(k).append(l).append(g).append(h).append(d)},update_options:function(d){var b=$("#track_"+d+"_block_color").val(),c=$("#track_"+d+"_label_color").val(),a=$("#track_"+d+"_show_count").attr("checked");if(b!==this.prefs.block_color||c!==this.prefs.label_color||a!=this.prefs.show_counts){this.prefs.block_color=b;this.prefs.label_color=c;this.prefs.show_counts=a;this.tile_cache.clear();this.draw()}}});var ReadTrack=function(c!
,a,b){FeatureTrack.call(this,c,a,b);this.track_type="ReadTrack"};$.extend(ReadTrack.prototype,TiledTrack.prototype,FeatureTrack.prototype,{});
\ No newline at end of file
+var DEBUG=false;var DENSITY=1000,FEATURE_LEVELS=10,DATA_ERROR="There was an error in indexing this dataset.",DATA_NOCONVERTER="A converter for this dataset is not installed. Please check your datatypes_conf.xml file.",DATA_NONE="No data for this chrom/contig.",DATA_PENDING="Currently indexing... please wait",DATA_LOADING="Loading data...",CACHED_TILES_FEATURE=10,CACHED_TILES_LINE=30,CACHED_DATA=5,CONTEXT=$("<canvas></canvas>").get(0).getContext("2d"),RIGHT_STRAND,LEFT_STRAND;var right_img=new Image();right_img.src="../images/visualization/strand_right.png";right_img.onload=function(){RIGHT_STRAND=CONTEXT.createPattern(right_img,"repeat")};var left_img=new Image();left_img.src="../images/visualization/strand_left.png";left_img.onload=function(){LEFT_STRAND=CONTEXT.createPattern(left_img,"repeat")};var right_img_inv=new Image();right_img_inv.src="../images/visualization/strand_right_inv.png";right_img_inv.onload=function(){RIGHT_STRAND_INV=CONTEXT.createPattern(right_img_inv,!
"repeat")};var left_img_inv=new Image();left_img_inv.src="../images/visualization/strand_left_inv.png";left_img_inv.onload=function(){LEFT_STRAND_INV=CONTEXT.createPattern(left_img_inv,"repeat")};function commatize(b){b+="";var a=/(\d+)(\d{3})/;while(a.test(b)){b=b.replace(a,"$1,$2")}return b}var Cache=function(a){this.num_elements=a;this.clear()};$.extend(Cache.prototype,{get:function(b){var a=this.key_ary.indexOf(b);if(a!=-1){this.key_ary.splice(a,1);this.key_ary.push(b)}return this.obj_cache[b]},set:function(b,c){if(!this.obj_cache[b]){if(this.key_ary.length>=this.num_elements){var a=this.key_ary.shift();delete this.obj_cache[a]}this.key_ary.push(b)}this.obj_cache[b]=c;return c},clear:function(){this.obj_cache={};this.key_ary=[]}});var Drawer=function(){};$.extend(Drawer.prototype,{intensity:function(b,a,c){},});drawer=new Drawer();var View=function(b,d,c,a){this.vis_id=c;this.dbkey=a;this.title=d;this.chrom=b;this.tracks=[];this.label_tracks=[];this.max_low=0;this.max_h!
igh=0;this.center=(this.max_high-this.max_low)/2;this.zoom_factor=3;th
is.zoom_level=0;this.track_id_counter=0};$.extend(View.prototype,{add_track:function(a){a.view=this;a.track_id=this.track_id_counter;this.tracks.push(a);if(a.init){a.init()}a.container_div.attr("id","track_"+a.track_id);this.track_id_counter+=1},add_label_track:function(a){a.view=this;this.label_tracks.push(a)},remove_track:function(a){a.container_div.fadeOut("slow",function(){$(this).remove()});delete this.tracks[a]},update_options:function(){var b=$("ul#sortable-ul").sortable("toArray");var d=[];var c=$("#viewport > div").sort(function(g,f){return b.indexOf($(g).attr("id"))>b.indexOf($(f).attr("id"))});$("#viewport > div").remove();$("#viewport").html(c);for(var e in view.tracks){var a=view.tracks[e];if(a.update_options){a.update_options(e)}}},reset:function(){this.low=this.max_low;this.high=this.max_high;this.center=this.center=(this.max_high-this.max_low)/2;this.zoom_level=0;$(".yaxislabel").remove()},redraw:function(f){this.span=this.max_high-this.max_low;var d=this.span/Math.pow(this.zoom_factor,this.zoom_level),b=this.center-(d/2),e=b+d;if(b<0){b=0;e=b+d}else{if(e>this.max_high){e=this.max_high;b=e-d}}this.low=Math.floor(b);this.high=Math.ceil(e);this.center=Math.round(this.low+(this.high-this.low)/2);this.resolution=Math.pow(10,Math.ceil(Math.log((this.high-this.low)/200)/Math.LN10));this.zoom_res=Math.pow(FEATURE_LEVELS,Math.max(0,Math.ceil(Math.log(this.resolution,FEATURE_LEVELS)/Math.log(FEATURE_LEVELS))));$("#overview-box").css({left:(this.low/this.span)*$("#overview-viewport").width(),width:Math.max(12,((this.high-this.low)/this.span)*$("#overview-viewport").width())}).show();$("#low").val(commatize(this.low));$("#high").val(commatize(this.high));if(!f){for(var c=0,a=this.tracks.length;c<a;c++){if(this.tracks[c].enabled){this.tracks[c].draw()}}for(var c=0,a=this.label_tracks.length;c<a;c++){this.label_tracks[c].draw()}}},zoom_in:function(a,b){if(this.max_high===0||this.high-this.low<30){return}if(a){this.center=a/b.width()*(this.high-this.low)+this.low}this.zoom_level+=1;this.redraw()},zoom_out:function(){if(this.max_high===0){return}if(this.zoom_level<=0){this.zoom_level=0;return}this.zoom_level-=1;this.redraw()}});
var Track=function(a,b){this.name=a;this.parent_element=b;this.init_global()};$.extend(Track.prototype,{init_global:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div></div>").addClass("track").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)},init_each:function(c,b){var a=this;a.enabled=false;a.data_queue={};a.tile_cache.clear();a.data_cache.clear();a.content_div.css("height","30px");if(!a.content_div.text()){a.content_div.text(DATA_LOADING)}a.container_div.removeClass("nodata error pending");if(a.view.chrom){$.getJSON(data_url,c,function(d){if(!d||d==="error"){a.container_div.addClass("error");a.content_div.text(DATA_ERROR)}else{if(d==="no converter"){a.container_div.addClass("error");a.content_div.text(DATA_NOCONVERTER)}else{if(d.data&&d.data.length===0||d.data===null){a.container_div.addClass("nodata");a.content_div.text(DATA_NONE)}else{if(d==="pending"){a.container_div.addClass("pending");a.content_div.text(DATA_PENDING);setTimeout(function(){a.init()},5000)}else{a.content_div.text("");a.content_div.css("height",a.height_px+"px");a.enabled=true;b(d);a.draw()}}}}})}else{a.container_div.addClass("nodata");a.content_div.text(DATA_NONE)}}});
var TiledTrack=function(){};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,e=this.view.high,f=e-i,d=this.view.resolution;if(DEBUG){$("#debug").text(d+" "+this.view.zoom_res)}var k=$("<div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(k);var l=this.content_div.width()/f;var h;var a=Math.floor(i/d/DENSITY);while((a*DENSITY*d)<e){var j=this.content_div.width()+"_"+this.view.zoom_level+"_"+a;var c=this.tile_cache.get(j);if(c){var g=a*DENSITY*d;var b=(g-i)*l;if(this.left_offset){b-=this.left_offset}c.css({left:b});k.append(c);this.max_height=Math.max(this.max_height,c.height())}else{this.delayed_draw(this,j,i,e,a,d,k,l)}a+=1}},delayed_draw:function(c,e,a,f,b,d,g,h){setTimeout(function(){if(!(a>c.view.high||f<c.view.low)){tile_element=c.draw_tile(d,b,g,h);if(tile_element){c.tile_cache.set(e,tile_element);c.max_height=Math.max(c.max_height,tile_element.height());c.content_div.css("height",c.max_height+"px")}}},50)}});
var LabelTrack=function(a){Track.call(this,null,a);this.track_type="LabelTrack";this.hidden=true;this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style='position: relative; height: 1.3em;'></div>");while(a<c.high){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+commatize(a)+"</div>").css({position:"absolute",left:f-1}));a+=g}this.content_div.children(":first").remove();this.content_div.append(b)}});
var LineTrack=function(c,a,b){this.track_type="LineTrack";Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.height_px=100;this.container_div.addClass("line-track");this.dataset_id=a;this.data_cache=new Cache(CACHED_DATA);this.tile_cache=new Cache(CACHED_TILES_LINE);this.prefs={min_value:undefined,max_value:undefined,mode:"Line"};if(b.min_value!==undefined){this.prefs.min_value=b.min_value}if(b.max_value!==undefined){this.prefs.max_value=b.max_value}if(b.mode!==undefined){this.prefs.mode=b.mode}};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this,b=a.view.tracks.indexOf(a);a.vertical_range=undefined;this.init_each({stats:true,chrom:a.view.chrom,low:null,high:null,dataset_id:a.dataset_id},function(c){data=c.data;if(isNaN(parseFloat(a.prefs.min_value))||isNaN(parseFloat(a.prefs.max_value))){a.prefs.min_value=data.min;a.prefs.max_value=data.max;$("#track_"+b+"_minval").val(a.prefs.min_value);$("#track_"+b+"_maxval").val(a.prefs.max_value)}a.vertical_range=a.prefs.max_value-a.prefs.min_value;a.total_frequency=data.total_frequency;$("#linetrack_"+b+"_minval").remove();$("#linetrack_"+b+"_maxval").remove();var e=$("<div></div>").addClass("yaxislabel").attr("id","linetrack_"+b+"_minval").text(a.prefs.min_value);var d=$("<div></div>").addClass("yaxislabel").attr("id","linetrack_"+b+"_maxval").text(a.prefs.max_value);d.css({position:"relative",top:"25px"});d.prependTo(a.container_div);e.css({position:"relative",top:a.height_px+55+"px"});e.prependTo(a.container_div)})},get_data:function(d,b){var c=this,a=b*DENSITY*d,f=(b+1)*DENSITY*d,e=d+"_"+b;if(!c.data_queue[e]){c.data_queue[e]=true;$.ajax({url:data_url,dataType:"json",data:{chrom:this.view.chrom,low:a,high:f,dataset_id:this.dataset_id,resolution:this.view.resolution},success:function(g){data=g.data;c.data_cache.set(e,data);delete c.data_queue[e];c.draw()},error:function(h,g,i){console.log(h,g,i)}})}},draw_tile:function(o,q,c,e){if(this.vertical_range===undefined){return}var r=q*DENSITY*o,a=DENSITY*o,b=$("<canvas class='tile'></canvas>"),u=o+"_"+q;if(this.data_cache.get(u)===undefined){this.get_data(o,q);return}var t=this.data_cache.get(u);if(t===null){return}b.css({position:"absolute",top:0,left:(r-this.view.low)*e});b.get(0).width=Math.ceil(a*e);b.get(0).height=this.height_px;var n=b.get(0).getContext("2d"),k=false,l=this.prefs.min_value,g=this.prefs.max_value,m=this.vertical_range,s=this.total_frequency,d=this.height_px;n.beginPath();if(t.length>1){var f=Math.ceil((t[1][0]-t[0][0])*e)}else{var f=10}for(var p=0;p<t.length;p++){var j=t[p][0]-r;var h=t[p][1];if(this.prefs.mode=="Intensity"){if(h===null){continue}j=j*e;if(h<=l){h=l}else{if(h>=g){h=g}}h=255-Math.floor((h-l)/m*255);n.fillStyle="rgb("+h+","+h+","+h+")";n.fillRect(j,0,f,this.height_px)}else{if(h===null){k=false;continue}else{j=j*e;if(h<=l){h=l}else{if(h>=g){h=g}}h=Math.round(d-(h-l)/m*d);if(k){n.lineTo(j,h)}else{n.moveTo(j,h);k=true}}}}n.stroke();c.append(b);return b},gen_options:function(n){var a=$("<div></div>").addClass("form-row");var h="track_"+n+"_minval",k="track_"+n+"_maxval",e="track_"+n+"_mode",l=$("<label></label>").attr("for",h).text("Min value:"),b=(this.prefs.min_value===undefined?"":this.prefs.min_value),m=$("<input></input>").attr("id",h).val(b),g=$("<label></label>").attr("for",k).text("Max value:"),j=(this.prefs.max_value===undefined?"":this.prefs.max_value),f=$("<input></input>").attr("id",k).val(j),d=$("<label></label>").attr("for",e).text("Display mode:"),i=(this.prefs.mode===undefined?"Line":this.prefs.mode),c=$('<select id="'+e+'"><option value="Line" id="mode_Line">Line</option><option value="Intensity" id="mode_Intensity">Intensity</option></select>');c.children("#mode_"+i).attr("selected","selected");return a.append(l).append(m).append(g).append(f).append(d).append(c)},update_options:function(d){var a=$("#track_"+d+"_minval").val(),c=$("#track_"+d+"_maxval").val(),b=$("#track_"+d+"_mode option:selected").val();if(a!==this.prefs.min_value||c!==this.prefs.max_value||b!=this.prefs.mode){this.prefs.min_value=parseFloat(a);this.prefs.max_value=parseFloat(c);this.prefs.mode=b;this.vertical_range=this.prefs.max_value-this.prefs.min_value;$("#linetrack_"+d+"_minval").text(this.prefs.min_value);$("#linetrack_"+d+"_maxval").text(this.prefs.max_value);this.tile_cache.clear();this.draw()}}});
var FeatureTrack=function(c,a,b){this.track_type="FeatureTrack";Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.height_px=100;this.container_div.addClass("feature-track");this.dataset_id=a;this.zo_slots={};this.show_labels_scale=0.001;this.showing_details=false;this.vertical_detail_px=10;this.vertical_nodetail_px=3;this.default_font="9px Monaco, Lucida Console, monospace";this.left_offset=200;this.inc_slots={};this.data_queue={};this.s_e_by_tile={};this.tile_cache=new Cache(CACHED_TILES_FEATURE);this.data_cache=new Cache(20);this.prefs={block_color:"black",label_color:"black",show_counts:false};if(b.block_color!==undefined){this.prefs.block_color=b.block_color}if(b.label_color!==undefined){this.prefs.label_color=b.label_color}if(b.show_counts!==undefined){this.prefs.show_counts=b.show_counts}};$.extend(FeatureTrack.prototype,TiledTrack.prototype,{init:function(){var a=this,b=a.view.max_low+"_"+a.view.max_high;this.init_each({low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom,resolution:this.view.resolution},function(c){a.data_cache.set(b,c);a.draw()})},get_data:function(a,d){var b=this,c=a+"_"+d;if(!b.data_queue[c]){b.data_queue[c]=true;$.getJSON(data_url,{chrom:b.view.chrom,low:a,high:d,dataset_id:b.dataset_id,resolution:this.view.resolution},function(e){b.data_cache.set(c,e);delete b.data_queue[c];b.draw()})}},incremental_slots:function(a,g,c){if(!this.inc_slots[a]){this.inc_slots[a]={};this.inc_slots[a].w_scale=1/a;this.s_e_by_tile[a]={}}var m=this.inc_slots[a].w_scale,v=[],h=0,b=$("<canvas></canvas>").get(0).getContext("2d"),n=this.view.max_low;var d,f,x=[];for(var s=0,t=g.length;s<t;s++){var e=g[s],l=e[0];if(this.inc_slots[a][l]!==undefined){h=Math.max(h,this.inc_slots[a][l]);x.push(this.inc_slots[a][l])}else{v.push(s)}}for(var s=0,t=v.length;s<t;s++){var e=g[v[s]];l=e[0],feature_start=e[1],feature_end=e[2],feature_name=e[3];d=Math.floor((feature_start-n)*m);f=Math.ceil((feature_end-n)*m);if(!c){var p=b.measureText(feature_name).width;if(d-p<0){f+=p}else{d-=p}}var r=0;while(true){var o=true;if(this.s_e_by_tile[a][r]!==undefined){for(var q=0,w=this.s_e_by_tile[a][r].length;q<w;q++){var u=this.s_e_by_tile[a][r][q];if(f>u[0]&&d<u[1]){o=false;break}}}if(o){if(this.s_e_by_tile[a][r]===undefined){this.s_e_by_tile[a][r]=[]}this.s_e_by_tile[a][r].push([d,f]);this.inc_slots[a][l]=r;h=Math.max(h,r);break}r++}}return h},draw_tile:function(R,h,m,ae){var z=h*DENSITY*R,X=(h+1)*DENSITY*R,w=DENSITY*R;var ac,ad,p;var Y=z+"_"+X;var ac=this.data_cache.get(Y);if(ac===undefined){this.data_queue[[z,X]]=true;this.get_data(z,X);return}if(ac.dataset_type=="array_tree"){p=30}else{var P=(ac.extra_info==="no_detail");var af=(P?this.vertical_nodetail_px:this.vertical_detail_px);p=this.incremental_slots(this.view.zoom_res,ac.data,P)*af+15;m.parent().css("height",Math.max(this.height_px,p)+"px");ad=this.inc_slots[this.view.zoom_res]}var a=Math.ceil(w*ae),F=$("<canvas class='tile'></canvas>"),T=this.prefs.label_color,f=this.prefs.block_color,J=this.left_offset;F.css({position:"absolute",top:0,left:(z-this.view.low)*ae-J});F.get(0).width=a+J;F.get(0).height=p;var t=F.get(0).getContext("2d");t.fillStyle=this.prefs.block_color;t.font=this.default_font;t.textAlign="right";var C=55,W=255-C,g=W*2/3;if(ac.dataset_type=="summary_tree"){var L=ac.data;var v=ac.max;var l=ac.avg;if(ac.data.length>2){var b=Math.ceil((L[1][0]-L[0][0])*ae)}else{var b=50}for(var aa=0,s=L.length;aa<s;aa++){var N=Math.ceil((L[aa][0]-z)*ae);var M=L[aa][1];if(!M){continue}var E=Math.floor(W-(M/v)*W);t.fillStyle="rgb("+E+","+E+","+E+")";t.fillRect(N+J,0,b,20);if(this.prefs.show_counts){if(E>g){t.fillStyle="black"}else{t.fillStyle="#ddd"}t.textAlign="center";t.fillText(L[aa][1],N+J+(b/2),12)}}m.append(F);return F}var ac=ac.data;var Z=0;for(var aa=0,s=ac.length;aa<s;aa++){var G=ac[aa],D=G[0],ab=G[1],O=G[2],A=G[3];if(ab<=X&&O>=z){var Q=Math.floor(Math.max(0,(ab-z)*ae)),u=Math.ceil(Math.min(a,(O-z)*ae)),K=ad[D]*af;if(P){t.fillRect(Q+J,K+5,u-Q,1)}else{var r=G[4],I=G[5],S=G[6],e=G[7];var q,U,B=null,ag=null;if(I&&S){B=Math.floor(Math.max(0,(I-z)*ae));ag=Math.ceil(Math.min(a,(S-z)*ae))}if(A!==undefined&&ab>z){t.fillStyle=T;if(h===0&&Q-t.measureText(A).width<0){t.textAlign="left";t.fillText(A,u+2+J,K+8)}else{t.textAlign="right";t.fillText(A,Q-2+J,K+8)}t.fillStyle=f}if(e){if(r){if(r=="+"){t.fillStyle=RIGHT_STRAND}else{if(r=="-"){t.fillStyle=LEFT_STRAND}}t.fillRect(Q+J,K,u-Q,10);t.fillStyle=f}for(var Y=0,d=e.length;Y<d;Y++){var n=e[Y],c=Math.floor(Math.max(0,(n[0]-z)*ae)),H=Math.ceil(Math.min(a,(n[1]-z)*ae));if(c>H){continue}q=5;U=3;t.fillRect(c+J,K+U,H-c,q);if(B!==undefined&&!(c>ag||H<B)){q=9;U=1;var V=Math.max(c,B),o=Math.min(H,ag);t.fillRect(V+J,K+U,o-V,q)}}}else{q=9;U=1;t.fillRect(Q+J,K+U,u-Q,q);if(G.strand){if(G.strand=="+"){t.fillStyle=RIGHT_STRAND_INV}else{if(G.strand=="-"){t.fillStyle=LEFT_STRAND_INV}}t.fillRect(Q+J,K,u-Q,10);t.fillStyle=prefs.block_color}}}Z++}}m.append(F);return F},gen_options:function(i){var a=$("<div></div>").addClass("form-row");var e="track_"+i+"_block_color",k=$("<label></label>").attr("for",e).text("Block color:"),l=$("<input></input>").attr("id",e).attr("name",e).val(this.prefs.block_color),j="track_"+i+"_label_color",g=$("<label></label>").attr("for",j).text("Text color:"),h=$("<input></input>").attr("id",j).attr("name",j).val(this.prefs.label_color),f="track_"+i+"_show_count",c=$("<label></label>").attr("for",f).text("Show summary counts"),b=$('<input type="checkbox" style="float:left;"></input>').attr("id",f).attr("name",f).attr("checked",this.prefs.show_counts),d=$("<div></div>").append(b).append(c);return a.append(k).append(l).append(g).append(h).append(d)},update_options:function(d){var b=$("#track_"+d+"_block_color").val(),c=$("#track_"+d+"_label_color").val(),a=$("#track_"+d+"_show_count").attr("checked");if(b!==this.prefs.block_color||c!==this.prefs.label_color||a!=this.prefs.show_counts){this.prefs.block_color=b;this.prefs.label_color=c;this.prefs.show_counts=a;this.tile_cache.clear();this.draw()}}});
var ReadTrack=function(c,a,b){FeatureTrack.call(this,c,a,b);this.track_type="ReadTrack"};$.extend(ReadTrack.prototype,TiledTrack.prototype,FeatureTrack.prototype,{});
\ No newline at end of file
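The packed source above is dense, so for orientation: View.redraw shrinks the full span by zoom_factor**zoom_level, re-centers and clamps the window, then derives a data resolution as the next power of 10 that keeps roughly 200 points on screen. A minimal Python sketch of that arithmetic (the concrete ZOOM_FACTOR value is an assumption; its definition falls outside this excerpt):

import math

ZOOM_FACTOR = 3  # assumed value of View.zoom_factor; defined outside this excerpt

def redraw_window(max_low, max_high, center, zoom_level):
    # Shrink the full span, center it, and clamp to the chromosome bounds,
    # mirroring the clamping order in View.redraw.
    span = max_high - max_low
    width = span / math.pow(ZOOM_FACTOR, zoom_level)
    low, high = center - width / 2.0, center + width / 2.0
    if low < 0:
        low, high = 0, width
    elif high > max_high:
        low, high = max_high - width, max_high
    low, high = int(math.floor(low)), int(math.ceil(high))
    # Next power of 10 that keeps ~200 data points in the window.
    resolution = 10 ** int(math.ceil(math.log10((high - low) / 200.0)))
    return low, high, resolution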
diff -r fe14a58568ad -r 155f2e89a02b static/scripts/trackster.js
--- a/static/scripts/trackster.js Wed Apr 21 16:42:09 2010 -0400
+++ b/static/scripts/trackster.js Wed Apr 21 18:35:59 2010 -0400
@@ -12,7 +12,7 @@
DATA_LOADING = "Loading data...",
CACHED_TILES_FEATURE = 10,
CACHED_TILES_LINE = 30,
- CACHED_DATA = 20,
+ CACHED_DATA = 5,
CONTEXT = $("<canvas></canvas>").get(0).getContext("2d"),
RIGHT_STRAND, LEFT_STRAND;
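The Cache that CACHED_DATA feeds is not shown in this diff, so purely as an illustration of what the constant bounds: a cache with the get/set/clear interface the tracks rely on could look like the sketch below (the evict-oldest policy is an assumption). Shrinking CACHED_DATA from 20 to 5 simply caps how many line-track data blocks are held at once.

class Cache(object):
    # Illustrative bounded cache; the real trackster Cache is not in this diff.
    def __init__(self, num_elements):
        self.num_elements = num_elements
        self.obj_cache = {}
        self.key_ary = []
    def get(self, key):
        return self.obj_cache.get(key)
    def set(self, key, value):
        if key not in self.obj_cache:
            if len(self.key_ary) >= self.num_elements:
                oldest = self.key_ary.pop(0)  # evict the oldest entry (assumed policy)
                del self.obj_cache[oldest]
            self.key_ary.append(key)
        self.obj_cache[key] = value
    def clear(self):
        self.obj_cache = {}
        self.key_ary = []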
@@ -227,7 +227,9 @@
track.tile_cache.clear();
track.data_cache.clear();
track.content_div.css( "height", "30px" );
- track.content_div.text(DATA_LOADING);
+ if (!track.content_div.text()) {
+ track.content_div.text(DATA_LOADING);
+ }
track.container_div.removeClass("nodata error pending");
if (track.view.chrom) {
@@ -640,10 +642,16 @@
feature_end = feature[2],
feature_name = feature[3];
f_start = Math.floor( (feature_start - max_low) * w_scale );
+ f_end = Math.ceil( (feature_end - max_low) * w_scale );
+
if (!no_detail) {
- f_start -= dummy_canvas.measureText(feature_name).width;
+ var text_len = dummy_canvas.measureText(feature_name).width;
+ if (f_start - text_len < 0) {
+ f_end += text_len;
+ } else {
+ f_start -= text_len;
+ }
}
- f_end = Math.ceil( (feature_end - max_low) * w_scale );
var j = 0;
// Try to fit the feature to the first slot that doesn't overlap any other features in that slot
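The hunk above teaches incremental_slots to reserve the label's width on the right-hand side whenever extending to the left would run off the tile. The same greedy row-packing logic, sketched in Python with illustrative names:

def assign_slot(slots, f_start, f_end, text_len):
    # `slots` maps row index -> list of (start, end) extents already placed.
    if f_start - text_len < 0:
        f_end += text_len    # no room for the label on the left; extend right
    else:
        f_start -= text_len  # default: label sits to the left of the feature
    row = 0
    while True:
        taken = slots.setdefault(row, [])
        # Take the first row whose reserved extents don't overlap [f_start, f_end).
        if all(f_end <= s or f_start >= e for s, e in taken):
            taken.append((f_start, f_end))
            return row
        row += 1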
@@ -795,9 +803,15 @@
thick_start = Math.floor( Math.max(0, (feature_ts - tile_low) * w_scale) );
thick_end = Math.ceil( Math.min(width, (feature_te - tile_low) * w_scale) );
}
- if (feature_start > tile_low) {
+ if (feature_name !== undefined && feature_start > tile_low) {
ctx.fillStyle = label_color;
- ctx.fillText(feature_name, f_start - 1 + left_offset, y_center + 8);
+ if (tile_index === 0 && f_start - ctx.measureText(feature_name).width < 0) {
+ ctx.textAlign = "left";
+ ctx.fillText(feature_name, f_end + 2 + left_offset, y_center + 8);
+ } else {
+ ctx.textAlign = "right";
+ ctx.fillText(feature_name, f_start - 2 + left_offset, y_center + 8);
+ }
ctx.fillStyle = block_color;
}
if (feature_blocks) {
@@ -851,7 +865,6 @@
j++;
}
}
-
parent_element.append( new_canvas );
return new_canvas;
}, gen_options: function(track_id) {
diff -r fe14a58568ad -r 155f2e89a02b static/trackster.css
--- a/static/trackster.css Wed Apr 21 16:42:09 2010 -0400
+++ b/static/trackster.css Wed Apr 21 18:35:59 2010 -0400
@@ -38,7 +38,7 @@
margin: 0px;
color: white;
margin-top: -6px;
- margin-bottom: -3px;
+ margin-bottom: -4px;
}
#overview-viewport {
diff -r fe14a58568ad -r 155f2e89a02b templates/tracks/browser.mako
--- a/templates/tracks/browser.mako Wed Apr 21 16:42:09 2010 -0400
+++ b/templates/tracks/browser.mako Wed Apr 21 18:35:59 2010 -0400
@@ -34,7 +34,6 @@
<div id="viewport-container" style="overflow-x: hidden; overflow-y: auto;">
<div id="viewport"></div>
</div>
-
</div>
<div id="nav-container" style="width:100%;">
<div id="nav-labeltrack"></div>
@@ -161,8 +160,7 @@
// To adjust the size of the viewport to fit the fixed-height footer
var refresh = function( e ) {
- $("#content").height( $(window).height() - $("#nav-container").height() - $("#masthead").height());
- $("#viewport-container").height( $("#content").height() - $("#top-labeltrack").height() - $("#nav-labeltrack").height() );
+ $("#viewport-container").height( $(window).height() - 120 );
$("#nav-container").width( $("#center").width() );
view.redraw();
};
@@ -176,7 +174,7 @@
this.current_height = e.clientY;
this.current_x = e.offsetX;
}).bind( "drag", function( e ) {
- var container = $(this).parent();
+ var container = $(this);
var delta = e.offsetX - this.current_x;
var new_scroll = container.scrollTop() - (e.clientY - this.current_height);
if ( new_scroll < container.get(0).scrollHeight - container.height() ) {
diff -r fe14a58568ad -r 155f2e89a02b templates/tracks/new_browser.mako
--- a/templates/tracks/new_browser.mako Wed Apr 21 16:42:09 2010 -0400
+++ b/templates/tracks/new_browser.mako Wed Apr 21 18:35:59 2010 -0400
@@ -1,4 +1,4 @@
-<form id="new-browser-form" action="javascript:void();" method="post" onsubmit="return false;">
+<form id="new-browser-form" action="javascript:void(0);" method="post" onsubmit="return false;">
<div class="form-row">
<label for="new-title">Browser name:</label>
<div class="form-row-input">
diff -r fe14a58568ad -r 155f2e89a02b templates/user/dbkeys.mako
--- a/templates/user/dbkeys.mako Wed Apr 21 16:42:09 2010 -0400
+++ b/templates/user/dbkeys.mako Wed Apr 21 18:35:59 2010 -0400
@@ -18,7 +18,7 @@
$(".db_hide").each(function() {
var pre = $(this);
pre.hide();
- pre.siblings("span").wrap( "<a href='javascript:void();'></a>" ).click( function() {
+ pre.siblings("span").wrap( "<a href='javascript:void(0);'></a>" ).click( function() {
pre.toggle();
});
});
details: http://www.bx.psu.edu/hg/galaxy/rev/fe14a58568ad
changeset: 3678:fe14a58568ad
user: rc
date: Wed Apr 21 16:42:09 2010 -0400
description:
lims: fixed data transfer bugs
diffstat:
lib/galaxy/web/controllers/requests_admin.py | 2 +-
scripts/galaxy_messaging/server/amqp_consumer.py | 15 +-
scripts/galaxy_messaging/server/data_transfer.py | 29 ++-
scripts/galaxy_messaging/server/galaxyweb_interface.py | 140 +++++++---------
4 files changed, 90 insertions(+), 96 deletions(-)
diffs (314 lines):
diff -r e600ab3fadc1 -r fe14a58568ad lib/galaxy/web/controllers/requests_admin.py
--- a/lib/galaxy/web/controllers/requests_admin.py Wed Apr 21 11:44:19 2010 -0400
+++ b/lib/galaxy/web/controllers/requests_admin.py Wed Apr 21 16:42:09 2010 -0400
@@ -1680,7 +1680,7 @@
virtual_host=trans.app.config.amqp['virtual_host'],
insist=False)
chan = conn.channel()
- msg = amqp.Message(data,
+ msg = amqp.Message(data.replace('\n', '').replace('\r', ''),
content_type='text/plain',
application_headers={'msg_type': 'data_transfer'})
msg.properties["delivery_mode"] = 2
diff -r e600ab3fadc1 -r fe14a58568ad scripts/galaxy_messaging/server/amqp_consumer.py
--- a/scripts/galaxy_messaging/server/amqp_consumer.py Wed Apr 21 11:44:19 2010 -0400
+++ b/scripts/galaxy_messaging/server/amqp_consumer.py Wed Apr 21 16:42:09 2010 -0400
@@ -37,6 +37,7 @@
log.addHandler(fh)
global dbconnstr
+global config
def get_value(dom, tag_name):
'''
@@ -64,17 +65,20 @@
return rc
def recv_callback(msg):
+ global config
# check the message type.
msg_type = msg.properties['application_headers'].get('msg_type')
log.debug('\nMESSAGE RECVD: '+str(msg_type))
if msg_type == 'data_transfer':
log.debug('DATA TRANSFER')
# fork a new process to transfer datasets
- transfer_script = "scripts/galaxy_messaging/server/data_transfer.py"
- cmd = ( "python",
- transfer_script,
- msg.body )
- pid = subprocess.Popen(cmd).pid
+ transfer_script = os.path.join(os.getcwd(),
+ "scripts/galaxy_messaging/server/data_transfer.py")
+ cmd = '%s "%s" "%s" "%s"' % ("python",
+ transfer_script,
+ msg.body,
+ config.get("app:main", "id_secret") )
+ pid = subprocess.Popen(cmd, shell=True).pid
log.debug('Started process (%i): %s' % (pid, str(cmd)))
elif msg_type == 'sample_state_update':
log.debug('SAMPLE STATE UPDATE')
@@ -95,6 +99,7 @@
if len(sys.argv) < 2:
print 'Usage: python amqp_consumer.py <Galaxy config file>'
return
+ global config
config = ConfigParser.ConfigParser()
config.read(sys.argv[1])
global dbconnstr
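The hunk above passes msg.body and the id_secret to data_transfer.py through a single shell string. An argv-list invocation is a sketchable alternative that hands each argument through intact with no shell quoting to get wrong (the function name is illustrative):

import os, subprocess, sys

def start_transfer(msg_body, id_secret):
    transfer_script = os.path.join(os.getcwd(),
                                   "scripts/galaxy_messaging/server/data_transfer.py")
    # Each list element arrives as one argv entry, quotes in msg_body included.
    cmd = [sys.executable, transfer_script, msg_body, id_secret]
    return subprocess.Popen(cmd).pid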
diff -r e600ab3fadc1 -r fe14a58568ad scripts/galaxy_messaging/server/data_transfer.py
--- a/scripts/galaxy_messaging/server/data_transfer.py Wed Apr 21 11:44:19 2010 -0400
+++ b/scripts/galaxy_messaging/server/data_transfer.py Wed Apr 21 16:42:09 2010 -0400
@@ -8,7 +8,7 @@
Usage:
-python data_transfer.py <data_transfer_xml>
+python data_transfer.py <data_transfer_xml> <config_id_secret>
"""
@@ -57,7 +57,7 @@
class DataTransfer(object):
- def __init__(self, msg):
+ def __init__(self, msg, config_id_secret):
log.info(msg)
self.dom = xml.dom.minidom.parseString(msg)
self.host = self.get_value(self.dom, 'data_host')
@@ -67,6 +67,7 @@
self.library_id = self.get_value(self.dom, 'library_id')
self.folder_id = self.get_value(self.dom, 'folder_id')
self.dataset_files = []
+ self.config_id_secret = config_id_secret
count=0
while True:
index = self.get_value_index(self.dom, 'index', count)
@@ -137,7 +138,7 @@
'''
log.error(traceback.format_exc())
log.error('FATAL ERROR.'+msg)
- self.update_status('Error.', 'All', msg)
+ self.update_status('Error', 'All', msg+"\n"+traceback.format_exc())
sys.exit(1)
def transfer_files(self):
@@ -175,18 +176,24 @@
This method adds the dataset file to the target data library & folder
by opening the corresponding url on the running Galaxy server.
'''
- self.update_status(Sample.transfer_status.ADD_TO_LIBRARY)
- galaxyweb = GalaxyWebInterface(self.server_host, self.server_port,
- self.datatx_email, self.datatx_password)
- galaxyweb.add_to_library(self.server_dir, self.library_id, self.folder_id)
- galaxyweb.logout()
-
+ try:
+ self.update_status(Sample.transfer_status.ADD_TO_LIBRARY)
+ log.debug("dir:%s, lib:%s, folder:%s" % (self.server_dir, str(self.library_id), str(self.folder_id)))
+ galaxyweb = GalaxyWebInterface(self.server_host, self.server_port,
+ self.datatx_email, self.datatx_password,
+ self.config_id_secret)
+ galaxyweb.add_to_library(self.server_dir, self.library_id, self.folder_id)
+ galaxyweb.logout()
+ except Exception, e:
+ log.debug(e)
+ self.error_and_exit(str(e))
+
def update_status(self, status, dataset_index='All', msg=''):
'''
Update the data transfer status for this dataset in the database
'''
try:
- log.debug('Setting status "%s" for sample "%s"' % ( status, str(dataset_index) ) )
+ log.debug('Setting status "%s" for dataset "%s"' % ( status, str(dataset_index) ) )
df = from_json_string(self.galaxydb.get_sample_dataset_files(self.sample_id))
if dataset_index == 'All':
for dataset in self.dataset_files:
@@ -240,7 +247,7 @@
#
# Start the daemon
#
- dt = DataTransfer(sys.argv[1])
+ dt = DataTransfer(sys.argv[1], sys.argv[2])
dt.start()
sys.exit(0)
diff -r e600ab3fadc1 -r fe14a58568ad scripts/galaxy_messaging/server/galaxyweb_interface.py
--- a/scripts/galaxy_messaging/server/galaxyweb_interface.py Wed Apr 21 11:44:19 2010 -0400
+++ b/scripts/galaxy_messaging/server/galaxyweb_interface.py Wed Apr 21 16:42:09 2010 -0400
@@ -1,6 +1,5 @@
import ConfigParser
import sys, os
-import serial
import array
import time
import optparse,array
@@ -24,97 +23,81 @@
class GalaxyWebInterface(object):
- def __init__(self, server_host, server_port, datatx_email, datatx_password):
- self.server_host = server_host#config.get("main", "server_host")
- self.server_port = server_port#config.get("main", "server_port")
- self.datatx_email = datatx_email#config.get("main", "datatx_email")
- self.datatx_password = datatx_password#config.get("main", "datatx_password")
- try:
- # create url
- self.base_url = "http://%s:%s" % (self.server_host, self.server_port)
- # login
- url = "%s/user/login?email=%s&password=%s&login_button=Login" % (self.base_url, self.datatx_email, self.datatx_password)
- cj = cookielib.CookieJar()
- self.opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
- #print url
+ def __init__(self, server_host, server_port, datatx_email, datatx_password, config_id_secret):
+ self.server_host = server_host
+ self.server_port = server_port
+ self.datatx_email = datatx_email
+ self.datatx_password = datatx_password
+ self.config_id_secret = config_id_secret
+ # create url
+ self.base_url = "http://%s:%s" % (self.server_host, self.server_port)
+ # login
+ url = "%s/user/login?email=%s&password=%s&login_button=Login" % (self.base_url, self.datatx_email, self.datatx_password)
+ cj = cookielib.CookieJar()
+ self.opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
+ #print url
+ f = self.opener.open(url)
+ if f.read().find("ogged in as "+self.datatx_email) == -1:
+ # if the user doesn't exist, create the user
+ url = "%s/user/create?email=%s&username=%s&password=%s&confirm=%s&create_user_button=Submit" % ( self.base_url, self.datatx_email, self.datatx_email, self.datatx_password, self.datatx_password )
f = self.opener.open(url)
if f.read().find("ogged in as "+self.datatx_email) == -1:
- # if the user doesnt exist, create the user
- url = "%s/user/create?email=%s&username=%s&password=%s&confirm=%s&create_user_button=Submit" % ( self.base_url, self.datatx_email, self.datatx_email, self.datatx_password, self.datatx_password )
- f = self.opener.open(url)
- if f.read().find("ogged in as "+self.datatx_email) == -1:
- raise "The "+self.datatx_email+" user could not login to Galaxy"
- except:
- print traceback.format_exc()
- sys.exit(1)
+ raise Exception("The "+self.datatx_email+" user could not login to Galaxy")
def add_to_library(self, server_dir, library_id, folder_id, dbkey=''):
'''
This method adds the dataset file to the target data library & folder
by opening the corresponding url on the running Galaxy server.
'''
- try:
- params = urllib.urlencode(dict( cntrller='library_admin',
- tool_id='upload1',
- tool_state='None',
- library_id=self.encode_id(library_id),
- folder_id=self.encode_id(folder_id),
- upload_option='upload_directory',
- file_type='auto',
- server_dir=os.path.basename(server_dir),
- dbkey=dbkey,
- show_dataset_id='True',
- runtool_btn='Upload to library'))
- #url = "http://localhost:8080/library_common/upload_library_dataset?cntrller=librar…"
- #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
- url = self.base_url+"/library_common/upload_library_dataset"
- #print url
- #print params
- f = self.opener.open(url, params)
- if f.read().find("Data Library") == -1:
- raise "Dataset could not be uploaded to the data library"
- except:
- print traceback.format_exc()
- sys.exit(1)
+ params = urllib.urlencode(dict( cntrller='library_admin',
+ tool_id='upload1',
+ tool_state='None',
+ library_id=self.encode_id(library_id),
+ folder_id=self.encode_id(folder_id),
+ upload_option='upload_directory',
+ file_type='auto',
+ server_dir=os.path.basename(server_dir),
+ dbkey=dbkey,
+ show_dataset_id='True',
+ runtool_btn='Upload to library'))
+ #url = "http://localhost:8080/library_common/upload_library_dataset?cntrller=librar…"
+ #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
+ url = self.base_url+"/library_common/upload_library_dataset"
+ #print url
+ #print params
+ f = self.opener.open(url, params)
+ if f.read().find("Data Library") == -1:
+ raise Exception("Dataset could not be uploaded to the data library. URL: %s, PARAMS=%s" % (url, params))
def import_to_history(self, ldda_id, library_id, folder_id):
- try:
- params = urllib.urlencode(dict( cntrller='library_admin',
- show_deleted='False',
- library_id=self.encode_id(library_id),
- folder_id=self.encode_id(folder_id),
- ldda_ids=self.encode_id(ldda_id),
- do_action='import_to_history',
- use_panels='False'))
- #url = "http://lion.bx.psu.edu:8080/library_common/act_on_multiple_datasets?library…"
- #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
- url = self.base_url+"/library_common/act_on_multiple_datasets"
- #print url
- #print params
- f = self.opener.open(url, params)
- x = f.read()
- if x.find("1 dataset(s) have been imported into your history.") == -1:
- #print x
- raise Exception("Dataset could not be imported into history")
- except:
- print traceback.format_exc()
- sys.exit(1)
-
+ params = urllib.urlencode(dict( cntrller='library_admin',
+ show_deleted='False',
+ library_id=self.encode_id(library_id),
+ folder_id=self.encode_id(folder_id),
+ ldda_ids=self.encode_id(ldda_id),
+ do_action='import_to_history',
+ use_panels='False'))
+ #url = "http://lion.bx.psu.edu:8080/library_common/act_on_multiple_datasets?library…"
+ #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
+ url = self.base_url+"/library_common/act_on_multiple_datasets"
+ #print url
+ #print params
+ f = self.opener.open(url, params)
+ x = f.read()
+ if x.find("1 dataset(s) have been imported into your history.") == -1:
+ #print x
+ raise Exception("Dataset could not be imported into history")
def run_workflow(self, workflow_id, hid, workflow_step):
input = str(workflow_step)+'|input'
- try:
- params = urllib.urlencode({'id':self.encode_id(workflow_id),
- 'run_workflow': 'Run workflow',
- input: hid})
- url = self.base_url+"/workflow/run"
- #print url+'?'+params
- f = self.opener.open(url, params)
+ params = urllib.urlencode({'id':self.encode_id(workflow_id),
+ 'run_workflow': 'Run workflow',
+ input: hid})
+ url = self.base_url+"/workflow/run"
+ #print url+'?'+params
+ f = self.opener.open(url, params)
# if f.read().find("1 dataset(s) have been imported into your history.") == -1:
# raise Exception("Error in running the workflow")
- except:
- print traceback.format_exc()
- sys.exit(1)
def logout(self):
@@ -122,8 +105,7 @@
f = self.opener.open(self.base_url+'/user/logout')
def encode_id(self, obj_id ):
- id_secret = 'changethisinproductiontoo'
- id_cipher = Blowfish.new( id_secret )
+ id_cipher = Blowfish.new( self.config_id_secret )
# Convert to string
s = str( obj_id )
# Pad to a multiple of 8 with leading "!"
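The diff is cut off mid-method here; going by the comments shown, encode_id plausibly completes along these lines (a sketch only, since the tail is truncated from this excerpt):

from Crypto.Cipher import Blowfish

def encode_id(config_id_secret, obj_id):
    id_cipher = Blowfish.new(config_id_secret)
    s = str(obj_id)                   # convert to string
    s = ('!' * (8 - len(s) % 8)) + s  # pad to a multiple of 8 with leading '!'
    return id_cipher.encrypt(s).encode('hex')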
details: http://www.bx.psu.edu/hg/galaxy/rev/e600ab3fadc1
changeset: 3677:e600ab3fadc1
user: rc
date: Wed Apr 21 11:44:19 2010 -0400
description:
fixed the amqp config bug
diffstat:
lib/galaxy/config.py | 6 +++++-
1 files changed, 5 insertions(+), 1 deletions(-)
diffs (16 lines):
diff -r afbdedd0e758 -r e600ab3fadc1 lib/galaxy/config.py
--- a/lib/galaxy/config.py Wed Apr 21 11:42:50 2010 -0400
+++ b/lib/galaxy/config.py Wed Apr 21 11:44:19 2010 -0400
@@ -125,7 +125,11 @@
self.enable_cloud_execution = string_as_bool( kwargs.get( 'enable_cloud_execution', 'False' ) )
# Galaxy messaging (AMQP) configuration options
self.amqp = {}
- for k, v in global_conf_parser.items("galaxy_amqp"):
+ try:
+ amqp_config = global_conf_parser.items("galaxy_amqp")
+ except ConfigParser.NoSectionError:
+ amqp_config = {}
+ for k, v in amqp_config:
self.amqp[k] = v
def get( self, key, default ):
return self.config_dict.get( key, default )
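With the section made optional, a consumer reads the AMQP settings like this; apart from virtual_host (referenced in requests_admin.py above), the key names are assumptions:

import ConfigParser

config = ConfigParser.ConfigParser()
config.read('universe_wsgi.ini')
try:
    amqp_settings = dict(config.items('galaxy_amqp'))
except ConfigParser.NoSectionError:
    amqp_settings = {}
# A missing [galaxy_amqp] section now yields an empty dict instead of raising at startup.
print amqp_settings.get('virtual_host')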

10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/afbdedd0e758
changeset: 3676:afbdedd0e758
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Wed Apr 21 11:42:50 2010 -0400
description:
Cufflinks tools update; added cuffdiff wrapper.
diffstat:
tool_conf.xml.sample | 2 +
tools/ngs_rna/cuffcompare_wrapper.xml | 14 +-
tools/ngs_rna/cuffdiff_wrapper.py | 129 ++++++++++++++++++++++++++++++++++
tools/ngs_rna/cuffdiff_wrapper.xml | 117 ++++++++++++++++++++++++++++++
tools/ngs_rna/cufflinks_wrapper.py | 2 +-
tools/ngs_rna/cufflinks_wrapper.xml | 22 +++--
6 files changed, 270 insertions(+), 16 deletions(-)
diffs (367 lines):
diff -r d6fddb034db7 -r afbdedd0e758 tool_conf.xml.sample
--- a/tool_conf.xml.sample Wed Apr 21 11:35:21 2010 -0400
+++ b/tool_conf.xml.sample Wed Apr 21 11:42:50 2010 -0400
@@ -228,6 +228,8 @@
<section name="NGS: Expression Analysis" id="ngs-rna-tools">
<tool file="ngs_rna/tophat_wrapper.xml" />
<tool file="ngs_rna/cufflinks_wrapper.xml" />
+ <tool file="ngs_rna/cuffcompare_wrapper.xml" />
+ <tool file="ngs_rna/cuffdiff_wrapper.xml" />
</section>
<section name="NGS: SAM Tools" id="samtools">
<tool file="samtools/sam_bitwise_flag_filter.xml" />
diff -r d6fddb034db7 -r afbdedd0e758 tools/ngs_rna/cuffcompare_wrapper.xml
--- a/tools/ngs_rna/cuffcompare_wrapper.xml Wed Apr 21 11:35:21 2010 -0400
+++ b/tools/ngs_rna/cuffcompare_wrapper.xml Wed Apr 21 11:42:50 2010 -0400
@@ -15,15 +15,15 @@
$input2
</command>
<inputs>
- <param format="gtf" name="input1" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
- <param format="gtf" name="input2" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <param format="gtf" name="input1" type="data" label="GTF file produced by Cufflinks" help=""/>
+ <param format="gtf" name="input2" type="data" label="GTF file produced by Cufflinks" help=""/>
<conditional name="annotation">
- <param name="use_ref_annotation" type="select" label="Use Reference Annotation?">
+ <param name="use_ref_annotation" type="select" label="Use Reference Annotation">
<option value="No">No</option>
<option value="Yes">Yes</option>
</param>
<when value="Yes">
- <param format="gtf" name="reference_annotation" type="data" label="Reference Annotation" help=""/>
+ <param format="gtf" name="reference_annotation" type="data" label="Reference Annotation" help="Make sure your annotation file is in GTF format and that Galaxy knows that your file is GTF--not GFF."/>
<param name="ignore_nonoverlapping_reference" type="boolean" label="Ignore reference transcripts that are not overlapped by any transcript in input files"/>
</when>
<when value="No">
@@ -32,9 +32,9 @@
</inputs>
<outputs>
- <data format="gtf" name="transcripts_combined" />
- <data format="tracking" name="transcripts_tracking" />
- <data format="gtf" name="transcripts_accuracy" />
+ <data format="gtf" name="transcripts_combined" label="Cuffcompare on data ${input1.hid} and data ${input2.hid}: combined transcripts"/>
+ <data format="tracking" name="transcripts_tracking" label="Cuffcompare on data ${input1.hid} and data ${input2.hid}: transcript tracking"/>
+ <data format="gtf" name="transcripts_accuracy" label="Cuffcompare on data ${input1.hid} and data ${input2.hid}: transcript accuracy"/>
</outputs>
<tests>
diff -r d6fddb034db7 -r afbdedd0e758 tools/ngs_rna/cuffdiff_wrapper.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/ngs_rna/cuffdiff_wrapper.py Wed Apr 21 11:42:50 2010 -0400
@@ -0,0 +1,129 @@
+#!/usr/bin/env python
+
+import optparse, os, shutil, subprocess, sys, tempfile
+
+def stop_err( msg ):
+ sys.stderr.write( "%s\n" % msg )
+ sys.exit()
+
+def __main__():
+ #Parse Command Line
+ parser = optparse.OptionParser()
+
+ # Cuffdiff options.
+ parser.add_option( '-s', '--inner-dist-std-dev', dest='inner_dist_std_dev', help='The standard deviation for the distribution on inner distances between mate pairs. The default is 20bp.' )
+ parser.add_option( '-p', '--num-threads', dest='num_threads', help='Use this many threads to align reads. The default is 1.' )
+ parser.add_option( '-m', '--inner-mean-dist', dest='inner_mean_dist', help='This is the expected (mean) inner distance between mate pairs. \
+ For example, for paired end runs with fragments selected at 300bp, \
+ where each end is 50bp, you should set -m to be 200. The default is 45bp.')
+ parser.add_option( '-Q', '--min-mapqual', dest='min_mapqual', help='Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number. The default is 0.' )
+ parser.add_option( '-c', '--min-alignment-count', dest='min_alignment_count', help='The minimum number of alignments in a locus needed to conduct significance testing on changes in that locus observed between samples. If no testing is performed, changes in the locus are deemed not significant, and the locus\' observed changes don\'t contribute to correction for multiple testing. The default is 1,000 fragment alignments (up to 2,000 paired reads).' )
+ parser.add_option( '--FDR', dest='FDR', help='The allowed false discovery rate. The default is 0.05.' )
+
+ # Advanced Options:
+ parser.add_option( '--num-importance-samples', dest='num_importance_samples', help='Sets the number of importance samples generated for each locus during abundance estimation. Default: 1000' )
+ parser.add_option( '--max-mle-iterations', dest='max_mle_iterations', help='Sets the number of iterations allowed during maximum likelihood estimation of abundances. Default: 5000' )
+
+ # Wrapper / Galaxy options.
+ parser.add_option( '-A', '--inputA', dest='inputA', help='A transcript GTF file produced by cufflinks, cuffcompare, or other source.')
+ parser.add_option( '-1', '--input1', dest='input1', help='File of RNA-Seq read alignments in the SAM format. SAM is a standard short read alignment format that allows aligners to attach custom tags to individual alignments, and Cufflinks requires that the alignments you supply have some of these tags. Please see Input formats for more details.' )
+ parser.add_option( '-2', '--input2', dest='input2', help='File of RNA-Seq read alignments in the SAM format. SAM is a standard short read alignment format that allows aligners to attach custom tags to individual alignments, and Cufflinks requires that the alignments you supply have some of these tags. Please see Input formats for more details.' )
+
+ parser.add_option( "--isoforms_fpkm_tracking_output", dest="isoforms_fpkm_tracking_output" )
+ parser.add_option( "--genes_fpkm_tracking_output", dest="genes_fpkm_tracking_output" )
+ parser.add_option( "--cds_fpkm_tracking_output", dest="cds_fpkm_tracking_output" )
+ parser.add_option( "--tss_groups_fpkm_tracking_output", dest="tss_groups_fpkm_tracking_output" )
+ parser.add_option( "--isoforms_exp_output", dest="isoforms_exp_output" )
+ parser.add_option( "--genes_exp_output", dest="genes_exp_output" )
+ parser.add_option( "--tss_groups_exp_output", dest="tss_groups_exp_output" )
+ parser.add_option( "--cds_exp_fpkm_tracking_output", dest="cds_exp_fpkm_tracking_output" )
+ parser.add_option( "--splicing_diff_output", dest="splicing_diff_output" )
+ parser.add_option( "--cds_diff_output", dest="cds_diff_output" )
+ parser.add_option( "--promoters_diff_output", dest="promoters_diff_output" )
+
+ (options, args) = parser.parse_args()
+
+ # Make temp directory for output.
+ tmp_output_dir = tempfile.mkdtemp()
+
+ # Build command.
+
+ # Base.
+ cmd = "cuffdiff"
+
+ # Add options.
+ if options.inner_dist_std_dev:
+ cmd += ( " -s %i" % int ( options.inner_dist_std_dev ) )
+ if options.num_threads:
+ cmd += ( " -p %i" % int ( options.num_threads ) )
+ if options.inner_mean_dist:
+ cmd += ( " -m %i" % int ( options.inner_mean_dist ) )
+ if options.min_mapqual:
+ cmd += ( " -Q %i" % int ( options.min_mapqual ) )
+ if options.min_alignment_count:
+ cmd += ( " -c %i" % int ( options.min_alignment_count ) )
+ if options.FDR:
+ cmd += ( " --FDR %f" % float( options.FDR ) )
+ if options.num_importance_samples:
+ cmd += ( " --num-importance-samples %i" % int ( options.num_importance_samples ) )
+ if options.max_mle_iterations:
+ cmd += ( " --max-mle-iterations %i" % int ( options.max_mle_iterations ) )
+
+ # Add inputs.
+ cmd += " " + options.inputA + " " + options.input1 + " " + options.input2
+ print cmd
+
+ # Run command.
+ try:
+ tmp_name = tempfile.NamedTemporaryFile( dir=tmp_output_dir ).name
+ tmp_stderr = open( tmp_name, 'wb' )
+ proc = subprocess.Popen( args=cmd, shell=True, cwd=tmp_output_dir, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ tmp_stderr.close()
+
+ # Get stderr, allowing for case where it's very large.
+ tmp_stderr = open( tmp_name, 'rb' )
+ stderr = ''
+ buffsize = 1048576
+ try:
+ while True:
+ stderr += tmp_stderr.read( buffsize )
+ if not stderr or len( stderr ) % buffsize != 0:
+ break
+ except OverflowError:
+ pass
+ tmp_stderr.close()
+
+ # Error checking.
+ if returncode != 0:
+ raise Exception, stderr
+
+ # check that there are results in the output file
+ if len( open( tmp_output_dir + "/isoforms.fpkm_tracking", 'rb' ).read().strip() ) == 0:
+ raise Exception, 'The main output file is empty, there may be an error with your input file or settings.'
+ except Exception, e:
+ stop_err( 'Error running cuffdiff. ' + str( e ) )
+
+
+ # Copy output files from tmp directory to specified files.
+ try:
+ try:
+ shutil.copyfile( tmp_output_dir + "/isoforms.fpkm_tracking", options.isoforms_fpkm_tracking_output )
+ shutil.copyfile( tmp_output_dir + "/genes.fpkm_tracking", options.genes_fpkm_tracking_output )
+ shutil.copyfile( tmp_output_dir + "/cds.fpkm_tracking", options.cds_fpkm_tracking_output )
+ shutil.copyfile( tmp_output_dir + "/tss_groups.fpkm_tracking", options.tss_groups_fpkm_tracking_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_isoform_exp.diff", options.isoforms_exp_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_gene_exp.diff", options.genes_exp_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_tss_group_exp.diff", options.tss_groups_exp_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_splicing.diff", options.splicing_diff_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_cds.diff", options.cds_diff_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_cds_exp.diff", options.cds_diff_output )
+ shutil.copyfile( tmp_output_dir + "/0_1_promoters.diff", options.promoters_diff_output )
+ except Exception, e:
+ stop_err( 'Error in cuffdiff:\n' + str( e ) )
+ finally:
+ # Clean up temp dirs
+ if os.path.exists( tmp_output_dir ):
+ shutil.rmtree( tmp_output_dir )
+
+if __name__=="__main__": __main__()
\ No newline at end of file
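For orientation, with the defaults in the tool XML below the wrapper assembles a command of roughly this shape (input file names are hypothetical):

cuffdiff -p 4 -Q 0 -c 0 --FDR 0.050000 transcripts.gtf sample1.sam sample2.sam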
diff -r d6fddb034db7 -r afbdedd0e758 tools/ngs_rna/cuffdiff_wrapper.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/ngs_rna/cuffdiff_wrapper.xml Wed Apr 21 11:42:50 2010 -0400
@@ -0,0 +1,117 @@
+<tool id="cuffdiff" name="Cuffdiff" version="0.8.2">
+ <description>find significant changes in transcript expression, splicing, and promoter use</description>
+ <command interpreter="python">
+ cuffdiff_wrapper.py
+ --FDR=$fdr
+ --num-threads="4"
+ --min-mapqual=$min_mapqual
+ --min-alignment-count=$min_alignment_count
+
+ --isoforms_fpkm_tracking_output=$isoforms_fpkm_tracking
+ --genes_fpkm_tracking_output=$genes_fpkm_tracking
+ --cds_fpkm_tracking_output=$cds_fpkm_tracking
+ --tss_groups_fpkm_tracking_output=$tss_groups_fpkm_tracking
+ --isoforms_exp_output=$isoforms_exp
+ --genes_exp_output=$genes_exp
+ --tss_groups_exp_output=$tss_groups_exp
+ --cds_exp_fpkm_tracking_output=$cds_exp_fpkm_tracking
+ --splicing_diff_output=$splicing_diff
+ --cds_diff_output=$cds_diff
+ --promoters_diff_output=$promoters_diff
+
+ --inputA=$gtf_input
+ --input1=$aligned_reads1
+ --input2=$aligned_reads2
+ </command>
+ <inputs>
+ <param format="gtf" name="gtf_input" type="data" label="Transcripts" help="A transcript GTF file produced by cufflinks, cuffcompare, or other source."/>
+ <param format="sam" name="aligned_reads1" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <param format="sam" name="aligned_reads2" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <param name="fdr" type="float" value="0.05" label="False Discovery Rate" help="The allowed false discovery rate."/>
+ <param name="min_mapqual" type="integer" value="0" label="Min SAM Mapping Quality" help="Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number."/>
+ <param name="min_alignment_count" type="integer" value="0" label="Min Alignment Count" help="The minimum number of alignments in a locus for needed to conduct significance testing on changes in that locus observed between samples."/>
+ <conditional name="singlePaired">
+ <param name="sPaired" type="select" label="Is this library mate-paired?">
+ <option value="single">Single-end</option>
+ <option value="paired">Paired-end</option>
+ </param>
+ <when value="single"></when>
+ <when value="paired">
+ <param name="mean_inner_distance" type="integer" value="20" label="Mean Inner Distance between Mate Pairs"/>
+ <param name="inner_distance_std_dev" type="integer" value="20" label="Standard Deviation for Inner Distance between Mate Pairs"/>
+ </when>
+ </conditional>
+ </inputs>
+
+ <outputs>
+ <data format="tabular" name="isoforms_exp" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: isoform expression"/>
+ <data format="tabular" name="genes_exp" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: gene expression"/>
+ <data format="tabular" name="tss_groups_exp" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: TSS groups expression"/>
+ <data format="tabular" name="cds_exp_fpkm_tracking" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: CDS Expression FPKM Tracking"/>
+ <data format="tabular" name="splicing_diff" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: splicing diff"/>
+ <data format="tabular" name="cds_diff" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: CDS diff"/>
+ <data format="tabular" name="promoters_diff" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: promoters diff"/>
+ <data format="tabular" name="tss_groups_fpkm_tracking" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: TSS groups FPKM tracking" />
+ <data format="tabular" name="cds_fpkm_tracking" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: CDS FPKM tracking"/>
+ <data format="tabular" name="genes_fpkm_tracking" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: gene FPKM tracking"/>
+ <data format="tabular" name="isoforms_fpkm_tracking" label="Cuffdiff on data ${gtf_input.hid}, data ${aligned_reads1.hid}, and data ${aligned_reads2.hid}: isoform FPKM tracking"/>
+ </outputs>
+
+ <tests>
+ <test>
+ </test>
+ </tests>
+
+ <help>
+**Cuffdiff Overview**
+
+Cuffdiff is part of Cufflinks_. Cuffdiff finds significant changes in transcript expression, splicing, and promoter use. Please cite: Trapnell C, Williams BA, Pertea G, Mortazavi AM, Kwan G, van Baren MJ, Salzberg SL, Wold B, Pachter L. Transcript assembly and abundance estimation from RNA-Seq reveals thousands of new transcripts and switching among isoforms. (manuscript in press)
+
+.. _Cufflinks: http://cufflinks.cbcb.umd.edu/
+
+------
+
+**Know what you are doing**
+
+.. class:: warningmark
+
+There is no such thing (yet) as an automated gearshift in expression analysis. It is all like stick-shift driving in San Francisco. In other words, running this tool with default parameters will probably not give you meaningful results. A way to deal with this is to **understand** the parameters by carefully reading the `documentation`__ and experimenting. Fortunately, Galaxy makes experimenting easy.
+
+.. __: http://cufflinks.cbcb.umd.edu/manual.html#cuffdiff
+
+------
+
+**Input format**
+
+Cuffdiff takes Cufflinks or Cuffcompare GTF files as input along with two SAM files containing the fragment alignments for two or more samples.
+
+.. ___: http://www.todo.org
+
+------
+
+**Outputs**
+
+TODO
+
+-------
+
+**Settings**
+
+All of the options have a default value. You can change any of them. Most of the options in Cuffdiff have been implemented here.
+
+------
+
+**Cuffdiff parameter list**
+
+This is a list of implemented Cuffdiff options::
+
+ -m INT This is the expected (mean) inner distance between mate pairs. For example, for paired end runs with fragments selected at 300bp, where each end is 50bp, you should set -m to be 200. The default is 45bp.
+ -s INT The standard deviation for the distribution on inner distances between mate pairs. The default is 20bp.
+ -Q Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number. The default is 0.
+ -c INT The minimum number of alignments in a locus needed to conduct significance testing on changes in that locus observed between samples. If no testing is performed, changes in the locus are deemed not significant, and the locus' observed changes don't contribute to correction for multiple testing. The default is 1,000 fragment alignments (up to 2,000 paired reads).
+ --FDR FLOAT The allowed false discovery rate. The default is 0.05.
+ --num-importance-samples INT Sets the number of importance samples generated for each locus during abundance estimation. Default: 1000
+ --max-mle-iterations INT Sets the number of iterations allowed during maximum likelihood estimation of abundances. Default: 5000
+
+ </help>
+</tool>
diff -r d6fddb034db7 -r afbdedd0e758 tools/ngs_rna/cufflinks_wrapper.py
--- a/tools/ngs_rna/cufflinks_wrapper.py Wed Apr 21 11:35:21 2010 -0400
+++ b/tools/ngs_rna/cufflinks_wrapper.py Wed Apr 21 11:42:50 2010 -0400
@@ -56,7 +56,7 @@
if options.min_mapqual:
cmd += ( " -Q %i" % int ( options.min_mapqual ) )
if options.GTF:
- cmd += ( " -G %i" % options.GTF )
+ cmd += ( " -G %s" % options.GTF )
if options.num_importance_samples:
cmd += ( " --num-importance-samples %i" % int ( options.num_importance_samples ) )
if options.max_mle_iterations:
diff -r d6fddb034db7 -r afbdedd0e758 tools/ngs_rna/cufflinks_wrapper.xml
--- a/tools/ngs_rna/cufflinks_wrapper.xml Wed Apr 21 11:35:21 2010 -0400
+++ b/tools/ngs_rna/cufflinks_wrapper.xml Wed Apr 21 11:42:50 2010 -0400
@@ -1,5 +1,5 @@
<tool id="cufflinks" name="Cufflinks" version="0.8.2">
- <description>transcript assembly, differential expression, and differential regulation for RNA-Seq</description>
+ <description>transcript assembly and FPKM (RPKM) estimates for RNA-Seq data</description>
<command interpreter="python">
cufflinks_wrapper.py
--input=$input
@@ -32,7 +32,7 @@
</param>
<when value="No"></when>
<when value="Yes">
- <param format="gtf" name="reference_annotation_file" type="data" label="Reference Annotation" help=""/>
+ <param format="gtf" name="reference_annotation_file" type="data" label="Reference Annotation" help="Make sure your annotation file is in GTF format and that Galaxy knows that your file is GTF--not GFF."/>
</when>
</conditional>
<conditional name="singlePaired">
@@ -50,9 +50,9 @@
</inputs>
<outputs>
- <data format="expr" name="genes_expression" />
- <data format="expr" name="transcripts_expression" />
- <data format="gtf" name="assembled_isoforms" />
+ <data format="expr" name="genes_expression" label="Cufflinks on data ${input.hid}: gene expression"/>
+ <data format="expr" name="transcripts_expression" label="Cufflinks on data ${input.hid}: transcript expression"/>
+ <data format="gtf" name="assembled_isoforms" label="Cufflinks on data ${input.hid}: assembled transcripts"/>
</outputs>
<tests>
@@ -60,10 +60,16 @@
<param name="sPaired" value="single"/>
<param name="input" value="cufflinks_in.sam"/>
<param name="mean_inner_distance" value="20"/>
+ <param name="max_intron_len" value="300000"/>
+ <param name="min_isoform_fraction" value="0.05"/>
+ <param name="pre_mrna_fraction" value="0.05"/>
+ <param name="min_map_quality" value="0"/>
+ <param name="use_ref" value="No"/>
<output name="assembled_isoforms" file="cufflinks_out1.gtf"/>
- <!-- Can't test these right now because .expr files aren't recognized.
- <output name="genes_expression" file="cufflinks_out3.expr"/>
- <output name="transcripts_expression" file="cufflinks_out2.expr"/>
+ <!--
+ Can't test these right now b/c .expr files aren't recognized. Need to add them?
+ <output name="genes_expression" format="tabular" file="cufflinks_out3.expr"/>
+ <output name="transcripts_expression" format="tabular" file="cufflinks_out2.expr"/>
-->
</test>
</tests>

10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/d6fddb034db7
changeset: 3675:d6fddb034db7
user: Greg Von Kuster <greg(a)bx.psu.edu>
date: Wed Apr 21 11:35:21 2010 -0400
description:
Enable the grid helper and base grid templates to be used across webapps. Decouple the base controller from the model; controllers that subclass the base controller must now import a model. Eliminate the community's own base controller now that the shared base controller can be used across webapps. Add a new admin controller grid to the community space. Move the Admin controller to ~/base/controller.py and subclass the 2 admin controller grids ( galaxy and community ) from it. Add the group components to the community.
diffstat:
community_wsgi.ini.sample | 3 +
lib/galaxy/web/base/controller.py | 1195 +++++++++-
lib/galaxy/web/controllers/admin.py | 900 +-------
lib/galaxy/web/controllers/dataset.py | 6 +-
lib/galaxy/web/controllers/forms.py | 2 +-
lib/galaxy/web/controllers/history.py | 10 +-
lib/galaxy/web/controllers/library_admin.py | 2 +-
lib/galaxy/web/controllers/page.py | 15 +-
lib/galaxy/web/controllers/requests.py | 2 +-
lib/galaxy/web/controllers/requests_admin.py | 2 +-
lib/galaxy/web/controllers/root.py | 6 +-
lib/galaxy/web/controllers/tag.py | 2 +-
lib/galaxy/web/controllers/tracks.py | 1 +
lib/galaxy/web/controllers/user.py | 5 +-
lib/galaxy/web/controllers/visualization.py | 3 +-
lib/galaxy/web/controllers/workflow.py | 15 +-
lib/galaxy/web/framework/helpers/grids.py | 21 +-
lib/galaxy/webapps/community/base/controller.py | 24 -
lib/galaxy/webapps/community/buildapp.py | 6 +-
lib/galaxy/webapps/community/config.py | 7 +-
lib/galaxy/webapps/community/controllers/admin.py | 285 ++
lib/galaxy/webapps/community/controllers/tool_browser.py | 2 +-
lib/galaxy/webapps/community/model/__init__.py | 23 +-
lib/galaxy/webapps/community/model/mapping.py | 48 +-
lib/galaxy/webapps/community/model/migrate/versions/0001_initial_tables.py | 29 +-
lib/galaxy/webapps/community/security/__init__.py | 50 +-
templates/admin/dataset_security/group/group.mako | 1 +
templates/admin/dataset_security/group/group_create.mako | 1 +
templates/admin/dataset_security/group/group_rename.mako | 1 +
templates/admin/dataset_security/role/role.mako | 1 +
templates/admin/dataset_security/role/role_create.mako | 1 +
templates/admin/dataset_security/role/role_rename.mako | 1 +
templates/admin/index.mako | 145 -
templates/admin/user/reset_password.mako | 1 +
templates/admin/user/user.mako | 1 +
templates/grid_base.mako | 26 +-
templates/user/info.mako | 4 +-
templates/webapps/community/admin/index.mako | 93 +
templates/webapps/community/base_panels.mako | 2 +-
templates/webapps/galaxy/admin/index.mako | 139 +
test/base/twilltestcase.py | 12 +-
41 files changed, 1930 insertions(+), 1163 deletions(-)
diffs (truncated from 4111 to 3000 lines):
diff -r 076f572d7c9d -r d6fddb034db7 community_wsgi.ini.sample
--- a/community_wsgi.ini.sample Wed Apr 21 10:41:30 2010 -0400
+++ b/community_wsgi.ini.sample Wed Apr 21 11:35:21 2010 -0400
@@ -46,6 +46,9 @@
# NEVER enable this on a public site (even test or QA)
# use_interactive = true
+# this should be a comma-separated list of valid Galaxy users
+#admin_users = user1@example.org,user2@example.org
+
# Force everyone to log in (disable anonymous access)
require_login = False
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/base/controller.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,11 +1,9 @@
"""
Contains functionality needed in every web interface
"""
-
-import os, time, logging, re
-
-# Pieces of Galaxy to make global in every controller
-from galaxy import config, tools, web, model, util
+import os, time, logging, re, string, sys
+from datetime import datetime, timedelta
+from galaxy import config, tools, web, util
from galaxy.web import error, form, url_for
from galaxy.model.orm import *
from galaxy.workflow.modules import *
@@ -24,27 +22,28 @@
"""
Base class for Galaxy web application controllers.
"""
-
def __init__( self, app ):
"""Initialize an interface for application 'app'"""
self.app = app
-
def get_toolbox(self):
"""Returns the application toolbox"""
return self.app.toolbox
-
- def get_class( self, class_name ):
+ def get_class( self, trans, class_name ):
""" Returns the class object that a string denotes. Without this method, we'd have to do eval(<class_name>). """
if class_name == 'History':
- item_class = model.History
+ item_class = trans.model.History
elif class_name == 'HistoryDatasetAssociation':
- item_class = model.HistoryDatasetAssociation
+ item_class = trans.model.HistoryDatasetAssociation
elif class_name == 'Page':
- item_class = model.Page
+ item_class = trans.model.Page
elif class_name == 'StoredWorkflow':
- item_class = model.StoredWorkflow
+ item_class = trans.model.StoredWorkflow
elif class_name == 'Visualization':
- item_class = model.Visualization
+ item_class = trans.model.Visualization
+ elif class_name == 'Tool':
+ # TODO: Nate, this one should be changed to whatever you end up calling
+ # the pointer to the tool archive.
+ item_class = trans.model.Tool
else:
item_class = None
return item_class
@@ -53,62 +52,56 @@
class UsesAnnotations:
""" Mixin for getting and setting item annotations. """
- def get_item_annotation_str( self, db_session, user, item ):
+ def get_item_annotation_str( self, trans, user, item ):
""" Returns a user's annotation string for an item. """
- annotation_obj = self.get_item_annotation_obj( db_session, user, item )
+ annotation_obj = self.get_item_annotation_obj( trans, user, item )
if annotation_obj:
return annotation_obj.annotation
return None
-
- def get_item_annotation_obj( self, db_session, user, item ):
+ def get_item_annotation_obj( self, trans, user, item ):
""" Returns a user's annotation object for an item. """
# Get annotation association.
try:
- annotation_assoc_class = eval( "model.%sAnnotationAssociation" % item.__class__.__name__ )
+ annotation_assoc_class = eval( "trans.model.%sAnnotationAssociation" % item.__class__.__name__ )
except:
# Item doesn't have an annotation association class and cannot be annotated.
return False
-
# Get annotation association object.
- annotation_assoc = db_session.query( annotation_assoc_class ).filter_by( user=user )
- if item.__class__ == model.History:
+ annotation_assoc = trans.sa_session.query( annotation_assoc_class ).filter_by( user=user )
+ if item.__class__ == trans.model.History:
annotation_assoc = annotation_assoc.filter_by( history=item )
- elif item.__class__ == model.HistoryDatasetAssociation:
+ elif item.__class__ == trans.model.HistoryDatasetAssociation:
annotation_assoc = annotation_assoc.filter_by( hda=item )
- elif item.__class__ == model.StoredWorkflow:
+ elif item.__class__ == trans.model.StoredWorkflow:
annotation_assoc = annotation_assoc.filter_by( stored_workflow=item )
- elif item.__class__ == model.WorkflowStep:
+ elif item.__class__ == trans.model.WorkflowStep:
annotation_assoc = annotation_assoc.filter_by( workflow_step=item )
- elif item.__class__ == model.Page:
+ elif item.__class__ == trans.model.Page:
annotation_assoc = annotation_assoc.filter_by( page=item )
- elif item.__class__ == model.Visualization:
+ elif item.__class__ == trans.model.Visualization:
annotation_assoc = annotation_assoc.filter_by( visualization=item )
return annotation_assoc.first()
-
def add_item_annotation( self, trans, item, annotation ):
""" Add or update an item's annotation; a user can only have a single annotation for an item. """
-
# Get/create annotation association object.
- annotation_assoc = self.get_item_annotation_obj( trans.sa_session, trans.get_user(), item )
+ annotation_assoc = self.get_item_annotation_obj( trans, trans.user, item )
if not annotation_assoc:
# Create association.
# TODO: we could replace this eval() with a long if/else stmt, but this is more general without sacrificing
try:
- annotation_assoc_class = eval( "model.%sAnnotationAssociation" % item.__class__.__name__ )
+ annotation_assoc_class = eval( "trans.model.%sAnnotationAssociation" % item.__class__.__name__ )
except:
# Item doesn't have an annotation association class and cannot be annotated.
return False
annotation_assoc = annotation_assoc_class()
item.annotations.append( annotation_assoc )
annotation_assoc.user = trans.get_user()
-
# Set annotation.
annotation_assoc.annotation = annotation
return True
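The TODO above suggests the eval() could eventually be replaced. A sketch of an
eval-free variant using getattr, which fails the same way (returning False)
when an item has no annotation association class; this is a possible cleanup,
not code from this changeset:

    # Resolve e.g. 'HistoryAnnotationAssociation' from trans.model without eval().
    class_name = '%sAnnotationAssociation' % item.__class__.__name__
    annotation_assoc_class = getattr( trans.model, class_name, None )
    if annotation_assoc_class is None:
        # Item doesn't have an annotation association class and cannot be annotated.
        return False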
class SharableItemSecurity:
""" Mixin for handling security for sharable items. """
-
def security_check( self, user, item, check_ownership=False, check_accessible=False ):
""" Security checks for an item: checks if (a) user owns item or (b) item is accessible to user. """
if check_ownership:
@@ -125,7 +118,6 @@
class UsesHistoryDatasetAssociation:
""" Mixin for controllers that use HistoryDatasetAssociation objects. """
-
def get_dataset( self, trans, dataset_id, check_ownership=True, check_accessible=False ):
""" Get an HDA object by id. """
# DEPRECATION: We still support unencoded ids for backward compatibility
@@ -133,7 +125,7 @@
dataset_id = int( dataset_id )
except ValueError:
dataset_id = trans.security.decode_id( dataset_id )
- data = trans.sa_session.query( model.HistoryDatasetAssociation ).get( dataset_id )
+ data = trans.sa_session.query( trans.model.HistoryDatasetAssociation ).get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid dataset id: %s." % str( dataset_id ) )
if check_ownership:
@@ -151,7 +143,6 @@
else:
error( "You are not allowed to access this dataset" )
return data
-
def get_data( self, dataset, preview=True ):
""" Gets a dataset's data. """
# Get data from file, truncating if necessary.
@@ -169,12 +160,11 @@
class UsesVisualization( SharableItemSecurity ):
""" Mixin for controllers that use Visualization objects. """
-
def get_visualization( self, trans, id, check_ownership=True, check_accessible=False ):
""" Get a Visualization from the database by id, verifying ownership. """
# Load workflow from database
id = trans.security.decode_id( id )
- visualization = trans.sa_session.query( model.Visualization ).get( id )
+ visualization = trans.sa_session.query( trans.model.Visualization ).get( id )
if not visualization:
error( "Visualization not found" )
else:
@@ -182,17 +172,15 @@
class UsesStoredWorkflow( SharableItemSecurity ):
""" Mixin for controllers that use StoredWorkflow objects. """
-
def get_stored_workflow( self, trans, id, check_ownership=True, check_accessible=False ):
""" Get a StoredWorkflow from the database by id, verifying ownership. """
# Load workflow from database
id = trans.security.decode_id( id )
- stored = trans.sa_session.query( model.StoredWorkflow ).get( id )
+ stored = trans.sa_session.query( trans.model.StoredWorkflow ).get( id )
if not stored:
error( "Workflow not found" )
else:
return self.security_check( trans.get_user(), stored, check_ownership, check_accessible )
-
def get_stored_workflow_steps( self, trans, stored_workflow ):
""" Restores states for a stored workflow's steps. """
for step in stored_workflow.latest_workflow.steps:
@@ -217,32 +205,29 @@
class UsesHistory( SharableItemSecurity ):
""" Mixin for controllers that use History objects. """
-
def get_history( self, trans, id, check_ownership=True, check_accessible=False ):
"""Get a History from the database by id, verifying ownership."""
# Load history from database
id = trans.security.decode_id( id )
- history = trans.sa_session.query( model.History ).get( id )
+ history = trans.sa_session.query( trans.model.History ).get( id )
if not history:
error( "History not found" )
else:
return self.security_check( trans.get_user(), history, check_ownership, check_accessible )
-
def get_history_datasets( self, trans, history, show_deleted=False ):
""" Returns history's datasets. """
- query = trans.sa_session.query( model.HistoryDatasetAssociation ) \
- .filter( model.HistoryDatasetAssociation.history == history ) \
+ query = trans.sa_session.query( trans.model.HistoryDatasetAssociation ) \
+ .filter( trans.model.HistoryDatasetAssociation.history == history ) \
.options( eagerload( "children" ) ) \
- .join( "dataset" ).filter( model.Dataset.purged == False ) \
+ .join( "dataset" ).filter( trans.model.Dataset.purged == False ) \
.options( eagerload_all( "dataset.actions" ) ) \
- .order_by( model.HistoryDatasetAssociation.hid )
+ .order_by( trans.model.HistoryDatasetAssociation.hid )
if not show_deleted:
- query = query.filter( model.HistoryDatasetAssociation.deleted == False )
+ query = query.filter( trans.model.HistoryDatasetAssociation.deleted == False )
return query.all()
class Sharable:
""" Mixin for a controller that manages an item that can be shared. """
-
# Implemented methods.
@web.expose
@web.require_login( "share Galaxy items" )
@@ -251,52 +236,42 @@
trans.get_user().username = username
trans.sa_session.flush()
return self.sharing( trans, id, **kwargs )
-
# Abstract methods.
-
@web.expose
@web.require_login( "modify Galaxy items" )
def set_slug_async( self, trans, id, new_slug ):
""" Set item slug asynchronously. """
- pass
-
+ pass
@web.expose
@web.require_login( "share Galaxy items" )
def sharing( self, trans, id, **kwargs ):
""" Handle item sharing. """
pass
-
@web.expose
@web.require_login( "share Galaxy items" )
def share( self, trans, id=None, email="", **kwd ):
""" Handle sharing an item with a particular user. """
pass
-
@web.expose
def display_by_username_and_slug( self, trans, username, slug ):
""" Display item by username and slug. """
pass
-
@web.expose
@web.json
@web.require_login( "get item name and link" )
def get_name_and_link_async( self, trans, id=None ):
""" Returns item's name and link. """
pass
-
@web.expose
@web.require_login("get item content asynchronously")
def get_item_content_async( self, trans, id ):
""" Returns item content in HTML format. """
pass
-
# Helper methods.
-
def _make_item_accessible( self, sa_session, item ):
""" Makes item accessible--viewable and importable--and sets item's slug. Does not flush/commit changes, however. Item must have name, user, importable, and slug attributes. """
item.importable = True
self.create_item_slug( sa_session, item )
-
def create_item_slug( self, sa_session, item ):
""" Create item slug. Slug is unique among user's importable items for item's class. Returns true if item's slug was set; false otherwise. """
if item.slug is None or item.slug == "":
@@ -312,7 +287,6 @@
# Remove trailing '-'.
if slug_base.endswith('-'):
slug_base = slug_base[:-1]
-
# Make sure that slug is not taken; if it is, add a number to it.
slug = slug_base
count = 1
@@ -322,12 +296,1103 @@
count += 1
item.slug = slug
return True
-
return False
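create_item_slug derives a base from the item name and then appends an
incrementing counter until the slug is unique among the user's importable
items. The core loop in isolation, with a hypothetical taken() predicate
standing in for the database query:

    # Minimal standalone sketch of the slug-uniqueness loop.
    def unique_slug( slug_base, taken ):
        slug, count = slug_base, 1
        while taken( slug ):
            slug = '%s-%d' % ( slug_base, count )
            count += 1
        return slug

    unique_slug( 'my-history', lambda s: s in ( 'my-history', 'my-history-1' ) )
    # -> 'my-history-2'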
"""
Deprecated: `BaseController` used to be available under the name `Root`
"""
-class ControllerUnavailable( Exception ):
- pass
\ No newline at end of file
+class ControllerUnavailable( Exception ):
+ pass
+class Admin():
+ # Override these
+ user_list_grid = None
+ role_list_grid = None
+ group_list_grid = None
+
+ @web.expose
+ @web.require_admin
+ def index( self, trans, **kwd ):
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if webapp == 'galaxy':
+ return trans.fill_template( '/webapps/galaxy/admin/index.mako',
+ webapp=webapp,
+ message=message,
+ status=status )
+ else:
+ return trans.fill_template( '/webapps/community/admin/index.mako',
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def center( self, trans, **kwd ):
+ return trans.fill_template( '/admin/center.mako' )
+ @web.expose
+ @web.require_admin
+ def reload_tool( self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ return trans.fill_template( '/admin/reload_tool.mako',
+ toolbox=self.app.toolbox,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def tool_reload( self, trans, tool_version=None, **kwd ):
+ params = util.Params( kwd )
+ tool_id = params.tool_id
+ self.app.toolbox.reload( tool_id )
+ message = 'Reloaded tool: ' + tool_id
+ return trans.fill_template( '/admin/reload_tool.mako',
+ toolbox=self.app.toolbox,
+ message=message,
+ status='done' )
+
+ # Galaxy Role Stuff
+ @web.expose
+ @web.require_admin
+ def roles( self, trans, **kwargs ):
+ if 'operation' in kwargs:
+ operation = kwargs['operation'].lower()
+ if operation == "roles":
+ return self.role( trans, **kwargs )
+ if operation == "create":
+ return self.create_role( trans, **kwargs )
+ if operation == "delete":
+ return self.mark_role_deleted( trans, **kwargs )
+ if operation == "undelete":
+ return self.undelete_role( trans, **kwargs )
+ if operation == "purge":
+ return self.purge_role( trans, **kwargs )
+ if operation == "manage users and groups":
+ return self.manage_users_and_groups_for_role( trans, **kwargs )
+ if operation == "rename":
+ return self.rename_role( trans, **kwargs )
+ # Render the list view
+ return self.role_list_grid( trans, **kwargs )
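roles() above, like groups() and users() further down, dispatches on a
lowercase 'operation' request parameter and falls back to rendering the list
grid. The same idiom can be written table-driven; a sketch with only two
handlers wired up, as an alternative expression of the pattern rather than
what this changeset ships:

    # Table-driven form of the operation dispatch.
    def roles_dispatch( self, trans, **kwargs ):
        handlers = {
            'create': self.create_role,
            'rename': self.rename_role,
        }
        operation = kwargs.get( 'operation', '' ).lower()
        if operation in handlers:
            return handlers[ operation ]( trans, **kwargs )
        return self.role_list_grid( trans, **kwargs )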
+ @web.expose
+ @web.require_admin
+ def create_role( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if params.get( 'create_role_button', False ):
+ name = util.restore_text( params.name )
+ description = util.restore_text( params.description )
+ in_users = util.listify( params.get( 'in_users', [] ) )
+ in_groups = util.listify( params.get( 'in_groups', [] ) )
+ create_group_for_role = params.get( 'create_group_for_role', 'no' )
+ if not name or not description:
+ message = "Enter a valid name and a description"
+ elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==name ).first():
+ message = "A role with that name already exists"
+ else:
+ # Create the role
+ role = trans.app.model.Role( name=name, description=description, type=trans.app.model.Role.types.ADMIN )
+ trans.sa_session.add( role )
+ # Create the UserRoleAssociations
+ for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
+ ura = trans.app.model.UserRoleAssociation( user, role )
+ trans.sa_session.add( ura )
+ # Create the GroupRoleAssociations
+ for group in [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in in_groups ]:
+ gra = trans.app.model.GroupRoleAssociation( group, role )
+ trans.sa_session.add( gra )
+ if create_group_for_role == 'yes':
+ # Create the group
+ group = trans.app.model.Group( name=name )
+ trans.sa_session.add( group )
+ message = "Group '%s' has been created, and role '%s' has been created with %d associated users and %d associated groups" % \
+ ( group.name, role.name, len( in_users ), len( in_groups ) )
+ else:
+ message = "Role '%s' has been created with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
+ trans.sa_session.flush()
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='create_role',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ out_users = []
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ out_users.append( ( user.id, user.email ) )
+ out_groups = []
+ for group in trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
+ out_groups.append( ( group.id, group.name ) )
+ return trans.fill_template( '/admin/dataset_security/role/role_create.mako',
+ in_users=[],
+ out_users=out_users,
+ in_groups=[],
+ out_groups=out_groups,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def rename_role( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ id = params.get( 'id', None )
+ if not id:
+ message = "No role ids received for renaming"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ role = get_role( trans, id )
+ if params.get( 'rename_role_button', False ):
+ old_name = role.name
+ new_name = util.restore_text( params.name )
+ new_description = util.restore_text( params.description )
+ if not new_name:
+ message = 'Enter a valid name'
+ status = 'error'
+ elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==new_name ).first():
+ message = 'A role with that name already exists'
+ status = 'error'
+ else:
+ role.name = new_name
+ role.description = new_description
+ trans.sa_session.add( role )
+ trans.sa_session.flush()
+ message = "Role '%s' has been renamed to '%s'" % ( old_name, new_name )
+ return trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ return trans.fill_template( '/admin/dataset_security/role/role_rename.mako',
+ role=role,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def manage_users_and_groups_for_role( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ id = params.get( 'id', None )
+ if not id:
+ message = "No role ids received for managing users and groups"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ role = get_role( trans, id )
+ if params.get( 'role_members_edit_button', False ):
+ in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
+ for ura in role.users:
+ user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
+ if user not in in_users:
+ # Delete DefaultUserPermissions for previously associated users that have been removed from the role
+ for dup in user.default_permissions:
+ if role == dup.role:
+ trans.sa_session.delete( dup )
+ # Delete DefaultHistoryPermissions for previously associated users that have been removed from the role
+ for history in user.histories:
+ for dhp in history.default_permissions:
+ if role == dhp.role:
+ trans.sa_session.delete( dhp )
+ trans.sa_session.flush()
+ in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( params.in_groups ) ]
+ trans.app.security_agent.set_entity_role_associations( roles=[ role ], users=in_users, groups=in_groups )
+ trans.sa_session.refresh( role )
+ message = "Role '%s' has been updated with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status=status ) )
+ in_users = []
+ out_users = []
+ in_groups = []
+ out_groups = []
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ if user in [ x.user for x in role.users ]:
+ in_users.append( ( user.id, user.email ) )
+ else:
+ out_users.append( ( user.id, user.email ) )
+ for group in trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
+ if group in [ x.group for x in role.groups ]:
+ in_groups.append( ( group.id, group.name ) )
+ else:
+ out_groups.append( ( group.id, group.name ) )
+ library_dataset_actions = {}
+ if webapp == 'galaxy':
+ # Build a dictionary of library dataset folder paths and the lists of actions
+ # whose DatasetPermissions are associated with the Role, keyed by Library:
+ # { Library : { folder_path : [ action, action ] } }
+ for dp in role.dataset_actions:
+ for ldda in trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ) \
+ .filter( trans.app.model.LibraryDatasetDatasetAssociation.dataset_id==dp.dataset_id ):
+ root_found = False
+ folder_path = ''
+ folder = ldda.library_dataset.folder
+ while not root_found:
+ folder_path = '%s / %s' % ( folder.name, folder_path )
+ if not folder.parent:
+ root_found = True
+ else:
+ folder = folder.parent
+ folder_path = '%s %s' % ( folder_path, ldda.name )
+ library = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.root_folder_id == folder.id ) \
+ .first()
+ if library not in library_dataset_actions:
+ library_dataset_actions[ library ] = {}
+ try:
+ library_dataset_actions[ library ][ folder_path ].append( dp.action )
+ except:
+ library_dataset_actions[ library ][ folder_path ] = [ dp.action ]
+ return trans.fill_template( '/admin/dataset_security/role/role.mako',
+ role=role,
+ in_users=in_users,
+ out_users=out_users,
+ in_groups=in_groups,
+ out_groups=out_groups,
+ library_dataset_actions=library_dataset_actions,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def mark_role_deleted( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No role ids received for deleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Deleted %d roles: " % len( ids )
+ for role_id in ids:
+ role = get_role( trans, role_id )
+ role.deleted = True
+ trans.sa_session.add( role )
+ trans.sa_session.flush()
+ message += " %s " % role.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def undelete_role( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No role ids received for undeleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ count = 0
+ undeleted_roles = ""
+ for role_id in ids:
+ role = get_role( trans, role_id )
+ if not role.deleted:
+ message = "Role '%s' has not been deleted, so it cannot be undeleted." % role.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ role.deleted = False
+ trans.sa_session.add( role )
+ trans.sa_session.flush()
+ count += 1
+ undeleted_roles += " %s" % role.name
+ message = "Undeleted %d roles: %s" % ( count, undeleted_roles )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def purge_role( self, trans, **kwd ):
+ # This method should only be called for a Role that has previously been deleted.
+ # Purging a deleted Role deletes all of the following from the database:
+ # - UserRoleAssociations where role_id == Role.id
+ # - DefaultUserPermissions where role_id == Role.id
+ # - DefaultHistoryPermissions where role_id == Role.id
+ # - GroupRoleAssociations where role_id == Role.id
+ # - DatasetPermissions where role_id == Role.id
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No role ids received for purging"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Purged %d roles: " % len( ids )
+ for role_id in ids:
+ role = get_role( trans, role_id )
+ if not role.deleted:
+ message = "Role '%s' has not been deleted, so it cannot be purged." % role.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ # Delete UserRoleAssociations
+ for ura in role.users:
+ user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
+ # Delete DefaultUserPermissions for associated users
+ for dup in user.default_permissions:
+ if role == dup.role:
+ trans.sa_session.delete( dup )
+ # Delete DefaultHistoryPermissions for associated users
+ for history in user.histories:
+ for dhp in history.default_permissions:
+ if role == dhp.role:
+ trans.sa_session.delete( dhp )
+ trans.sa_session.delete( ura )
+ # Delete GroupRoleAssociations
+ for gra in role.groups:
+ trans.sa_session.delete( gra )
+ # Delete DatasetPermissions
+ for dp in role.dataset_actions:
+ trans.sa_session.delete( dp )
+ trans.sa_session.flush()
+ message += " %s " % role.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='roles',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+
+ # Galaxy Group Stuff
+ @web.expose
+ @web.require_admin
+ def groups( self, trans, **kwargs ):
+ if 'operation' in kwargs:
+ operation = kwargs['operation'].lower()
+ if operation == "groups":
+ return self.group( trans, **kwargs )
+ if operation == "create":
+ return self.create_group( trans, **kwargs )
+ if operation == "delete":
+ return self.mark_group_deleted( trans, **kwargs )
+ if operation == "undelete":
+ return self.undelete_group( trans, **kwargs )
+ if operation == "purge":
+ return self.purge_group( trans, **kwargs )
+ if operation == "manage users and roles":
+ return self.manage_users_and_roles_for_group( trans, **kwargs )
+ if operation == "rename":
+ return self.rename_group( trans, **kwargs )
+ # Render the list view
+ return self.group_list_grid( trans, **kwargs )
+ @web.expose
+ @web.require_admin
+ def rename_group( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ id = params.get( 'id', None )
+ if not id:
+ message = "No group ids received for renaming"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ group = get_group( trans, id )
+ if params.get( 'rename_group_button', False ):
+ old_name = group.name
+ new_name = util.restore_text( params.name )
+ if not new_name:
+ message = 'Enter a valid name'
+ status = 'error'
+ elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==new_name ).first():
+ message = 'A group with that name already exists'
+ status = 'error'
+ else:
+ group.name = new_name
+ trans.sa_session.add( group )
+ trans.sa_session.flush()
+ message = "Group '%s' has been renamed to '%s'" % ( old_name, new_name )
+ return trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ return trans.fill_template( '/admin/dataset_security/group/group_rename.mako',
+ group=group,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def manage_users_and_roles_for_group( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ group = get_group( trans, params.id )
+ if params.get( 'group_roles_users_edit_button', False ):
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.in_roles ) ]
+ in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
+ trans.app.security_agent.set_entity_group_associations( groups=[ group ], roles=in_roles, users=in_users )
+ trans.sa_session.refresh( group )
+ message += "Group '%s' has been updated with %d associated roles and %d associated users" % ( group.name, len( in_roles ), len( in_users ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status=status ) )
+ in_roles = []
+ out_roles = []
+ in_users = []
+ out_users = []
+ for role in trans.sa_session.query(trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
+ if role in [ x.role for x in group.roles ]:
+ in_roles.append( ( role.id, role.name ) )
+ else:
+ out_roles.append( ( role.id, role.name ) )
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ if user in [ x.user for x in group.users ]:
+ in_users.append( ( user.id, user.email ) )
+ else:
+ out_users.append( ( user.id, user.email ) )
+ message += 'Group %s is currently associated with %d roles and %d users' % ( group.name, len( in_roles ), len( in_users ) )
+ return trans.fill_template( '/admin/dataset_security/group/group.mako',
+ group=group,
+ in_roles=in_roles,
+ out_roles=out_roles,
+ in_users=in_users,
+ out_users=out_users,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def create_group( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ if params.get( 'create_group_button', False ):
+ name = util.restore_text( params.name )
+ in_users = util.listify( params.get( 'in_users', [] ) )
+ in_roles = util.listify( params.get( 'in_roles', [] ) )
+ if not name:
+ message = "Enter a valid name"
+ elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==name ).first():
+ message = "A group with that name already exists"
+ else:
+ # Create the group
+ group = trans.app.model.Group( name=name )
+ trans.sa_session.add( group )
+ trans.sa_session.flush()
+ # Create the UserRoleAssociations
+ for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
+ uga = trans.app.model.UserGroupAssociation( user, group )
+ trans.sa_session.add( uga )
+ trans.sa_session.flush()
+ # Create the GroupRoleAssociations
+ for role in [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in in_roles ]:
+ gra = trans.app.model.GroupRoleAssociation( group, role )
+ trans.sa_session.add( gra )
+ trans.sa_session.flush()
+ message = "Group '%s' has been created with %d associated users and %d associated roles" % ( name, len( in_users ), len( in_roles ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='create_group',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ out_users = []
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ out_users.append( ( user.id, user.email ) )
+ out_roles = []
+ for role in trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
+ out_roles.append( ( role.id, role.name ) )
+ return trans.fill_template( '/admin/dataset_security/group/group_create.mako',
+ in_users=[],
+ out_users=out_users,
+ in_roles=[],
+ out_roles=out_roles,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def mark_group_deleted( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = params.get( 'id', None )
+ if not id:
+ message = "No group ids received for marking deleted"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Deleted %d groups: " % len( ids )
+ for group_id in ids:
+ group = get_group( trans, group_id )
+ group.deleted = True
+ trans.sa_session.add( group )
+ trans.sa_session.flush()
+ message += " %s " % group.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def undelete_group( self, trans, **kwd ):
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No group ids received for undeleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ count = 0
+ undeleted_groups = ""
+ for group_id in ids:
+ group = get_group( trans, group_id )
+ if not group.deleted:
+ message = "Group '%s' has not been deleted, so it cannot be undeleted." % group.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ group.deleted = False
+ trans.sa_session.add( group )
+ trans.sa_session.flush()
+ count += 1
+ undeleted_groups += " %s" % group.name
+ message = "Undeleted %d groups: %s" % ( count, undeleted_groups )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def purge_group( self, trans, **kwd ):
+ # This method should only be called for a Group that has previously been deleted.
+ # Purging a deleted Group simply deletes all UserGroupAssociations and GroupRoleAssociations.
+ params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No group ids received for purging"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Purged %d groups: " % len( ids )
+ for group_id in ids:
+ group = get_group( trans, group_id )
+ if not group.deleted:
+ # We should never reach here, but just in case there is a bug somewhere...
+ message = "Group '%s' has not been deleted, so it cannot be purged." % group.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ # Delete UserGroupAssociations
+ for uga in group.users:
+ trans.sa_session.delete( uga )
+ # Delete GroupRoleAssociations
+ for gra in group.roles:
+ trans.sa_session.delete( gra )
+ trans.sa_session.flush()
+ message += " %s " % group.name
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='groups',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+
+ # Galaxy User Stuff
+ @web.expose
+ @web.require_admin
+ def create_new_user( self, trans, **kwargs ):
+ webapp = kwargs.get( 'webapp', 'galaxy' )
+ return trans.response.send_redirect( web.url_for( controller='user',
+ action='create',
+ webapp=webapp,
+ admin_view=True ) )
+ @web.expose
+ @web.require_admin
+ def reset_user_password( self, trans, **kwd ):
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No user ids received for resetting passwords"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ if 'reset_user_password_button' in kwd:
+ message = ''
+ status = ''
+ for user_id in ids:
+ user = get_user( trans, user_id )
+ password = kwd.get( 'password', None )
+ confirm = kwd.get( 'confirm', None )
+ if len( password ) < 6:
+ message = "Please use a password of at least 6 characters"
+ status = 'error'
+ break
+ elif password != confirm:
+ message = "Passwords do not match"
+ status = 'error'
+ break
+ else:
+ user.set_password_cleartext( password )
+ trans.sa_session.add( user )
+ trans.sa_session.flush()
+ if not message and not status:
+ message = "Passwords reset for %d users" % len( ids )
+ status = 'done'
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status=status ) )
+ users = [ get_user( trans, user_id ) for user_id in ids ]
+ if len( ids ) > 1:
+ id = ','.join( ids )
+ return trans.fill_template( '/admin/user/reset_password.mako',
+ id=id,
+ users=users,
+ password='',
+ confirm='',
+ webapp=webapp )
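reset_user_password applies two checks per submission: a minimum length of six
characters and a confirmation match. The rules in isolation, as a hypothetical
helper (not part of the diff) that returns an error message or None:

    # Restates the validation used in reset_user_password.
    def check_password( password, confirm ):
        if password is None or len( password ) < 6:
            return "Please use a password of at least 6 characters"
        if password != confirm:
            return "Passwords do not match"
        return None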
+ @web.expose
+ @web.require_admin
+ def mark_user_deleted( self, trans, **kwd ):
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No user ids received for deleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Deleted %d users: " % len( ids )
+ for user_id in ids:
+ user = get_user( trans, user_id )
+ user.deleted = True
+ trans.sa_session.add( user )
+ trans.sa_session.flush()
+ message += " %s " % user.email
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def undelete_user( self, trans, **kwd ):
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No user ids received for undeleting"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=message,
+ status='error' ) )
+ ids = util.listify( id )
+ count = 0
+ undeleted_users = ""
+ for user_id in ids:
+ user = get_user( trans, user_id )
+ if not user.deleted:
+ message = "User '%s' has not been deleted, so it cannot be undeleted." % user.email
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ user.deleted = False
+ trans.sa_session.add( user )
+ trans.sa_session.flush()
+ count += 1
+ undeleted_users += " %s" % user.email
+ message = "Undeleted %d users: %s" % ( count, undeleted_users )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def purge_user( self, trans, **kwd ):
+ # This method should only be called for a User that has previously been deleted.
+ # We keep the User in the database ( marked as purged ), and stuff associated
+ # with the user's private role in case we want the ability to unpurge the user
+ # some time in the future.
+ # Purging a deleted User deletes all of the following:
+ # - History where user_id = User.id
+ # - HistoryDatasetAssociation where history_id = History.id
+ # - Dataset where HistoryDatasetAssociation.dataset_id = Dataset.id
+ # - UserGroupAssociation where user_id == User.id
+ # - UserRoleAssociation where user_id == User.id EXCEPT FOR THE PRIVATE ROLE
+ # Purging Histories and Datasets must be handled via the cleanup_datasets.py script
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ id = kwd.get( 'id', None )
+ if not id:
+ message = "No user ids received for purging"
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ ids = util.listify( id )
+ message = "Purged %d users: " % len( ids )
+ for user_id in ids:
+ user = get_user( trans, user_id )
+ if not user.deleted:
+ # We should never reach here, but just in case there is a bug somewhere...
+ message = "User '%s' has not been deleted, so it cannot be purged." % user.email
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ private_role = trans.app.security_agent.get_private_user_role( user )
+ # Delete History
+ for h in user.active_histories:
+ trans.sa_session.refresh( h )
+ for hda in h.active_datasets:
+ # Delete HistoryDatasetAssociation
+ d = trans.sa_session.query( trans.app.model.Dataset ).get( hda.dataset_id )
+ # Delete Dataset
+ if not d.deleted:
+ d.deleted = True
+ trans.sa_session.add( d )
+ hda.deleted = True
+ trans.sa_session.add( hda )
+ h.deleted = True
+ trans.sa_session.add( h )
+ # Delete UserGroupAssociations
+ for uga in user.groups:
+ trans.sa_session.delete( uga )
+ # Delete UserRoleAssociations EXCEPT FOR THE PRIVATE ROLE
+ for ura in user.roles:
+ if ura.role_id != private_role.id:
+ trans.sa_session.delete( ura )
+ # Purge the user
+ user.purged = True
+ trans.sa_session.add( user )
+ trans.sa_session.flush()
+ message += "%s " % user.email
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ @web.expose
+ @web.require_admin
+ def users( self, trans, **kwargs ):
+ if 'operation' in kwargs:
+ operation = kwargs['operation'].lower()
+ if operation == "roles":
+ return self.user( trans, **kwargs )
+ if operation == "reset password":
+ return self.reset_user_password( trans, **kwargs )
+ if operation == "delete":
+ return self.mark_user_deleted( trans, **kwargs )
+ if operation == "undelete":
+ return self.undelete_user( trans, **kwargs )
+ if operation == "purge":
+ return self.purge_user( trans, **kwargs )
+ if operation == "create":
+ return self.create_new_user( trans, **kwargs )
+ if operation == "information":
+ return self.user_info( trans, **kwargs )
+ if operation == "manage roles and groups":
+ return self.manage_roles_and_groups_for_user( trans, **kwargs )
+ # Render the list view
+ return self.user_list_grid( trans, **kwargs )
+ @web.expose
+ @web.require_admin
+ def user_info( self, trans, **kwd ):
+ '''
+ Display the user information page, which consists of login information,
+ public username, reset password, and other user information obtained
+ during registration.
+ '''
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ user_id = kwd.get( 'id', None )
+ if not user_id:
+ message += "Invalid user id (%s) received" % str( user_id )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ user = get_user( trans, user_id )
+ return trans.response.send_redirect( web.url_for( controller='user',
+ action='show_info',
+ user_id=user.id,
+ admin_view=True,
+ **kwd ) )
+ @web.expose
+ @web.require_admin
+ def name_autocomplete_data( self, trans, q=None, limit=None, timestamp=None ):
+ """Return autocomplete data for user emails"""
+ ac_data = ""
+ for user in trans.sa_session.query( trans.app.model.User ).filter_by( deleted=False ).filter( func.lower( trans.app.model.User.table.c.email ).like( q.lower() + "%" ) ):
+ ac_data = ac_data + user.email + "\n"
+ return ac_data
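name_autocomplete_data builds a case-insensitive prefix match with SQL LIKE.
The filter expression in isolation, assuming session, User, and q are already
in scope as in the method above:

    from sqlalchemy import func

    # WHERE deleted = false AND lower(email) LIKE 'q...%'
    prefix = q.lower() + "%"
    matches = session.query( User ).filter_by( deleted=False ) \
                     .filter( func.lower( User.email ).like( prefix ) )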
+ @web.expose
+ @web.require_admin
+ def manage_roles_and_groups_for_user( self, trans, **kwd ):
+ webapp = kwd.get( 'webapp', 'galaxy' )
+ user_id = kwd.get( 'id', None )
+ message = ''
+ status = ''
+ if not user_id:
+ message += "Invalid user id (%s) received" % str( user_id )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='error' ) )
+ user = get_user( trans, user_id )
+ private_role = trans.app.security_agent.get_private_user_role( user )
+ if kwd.get( 'user_roles_groups_edit_button', False ):
+ # Make sure the user is not dis-associating himself from his private role
+ out_roles = kwd.get( 'out_roles', [] )
+ if out_roles:
+ out_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( out_roles ) ]
+ if private_role in out_roles:
+ message += "You cannot eliminate a user's private role association. "
+ status = 'error'
+ in_roles = kwd.get( 'in_roles', [] )
+ if in_roles:
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( in_roles ) ]
+ out_groups = kwd.get( 'out_groups', [] )
+ if out_groups:
+ out_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( out_groups ) ]
+ in_groups = kwd.get( 'in_groups', [] )
+ if in_groups:
+ in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( in_groups ) ]
+ if in_roles:
+ trans.app.security_agent.set_entity_user_associations( users=[ user ], roles=in_roles, groups=in_groups )
+ trans.sa_session.refresh( user )
+ message += "User '%s' has been updated with %d associated roles and %d associated groups (private roles are not displayed)" % \
+ ( user.email, len( in_roles ), len( in_groups ) )
+ trans.response.send_redirect( web.url_for( controller='admin',
+ action='users',
+ webapp=webapp,
+ message=util.sanitize_text( message ),
+ status='done' ) )
+ in_roles = []
+ out_roles = []
+ in_groups = []
+ out_groups = []
+ for role in trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
+ if role in [ x.role for x in user.roles ]:
+ in_roles.append( ( role.id, role.name ) )
+ elif role.type != trans.app.model.Role.types.PRIVATE:
+ # There is a 1 to 1 mapping between a user and a PRIVATE role, so private roles should
+ # not be listed in the roles form fields, except for the currently selected user's private
+ # role, which should always be in in_roles. The check above is added as an additional
+ # precaution, since for a period of time we were including private roles in the form fields.
+ out_roles.append( ( role.id, role.name ) )
+ for group in trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
+ if group in [ x.group for x in user.groups ]:
+ in_groups.append( ( group.id, group.name ) )
+ else:
+ out_groups.append( ( group.id, group.name ) )
+ message += "User '%s' is currently associated with %d roles and is a member of %d groups" % \
+ ( user.email, len( in_roles ), len( in_groups ) )
+ if not status:
+ status = 'done'
+ return trans.fill_template( '/admin/user/user.mako',
+ user=user,
+ in_roles=in_roles,
+ out_roles=out_roles,
+ in_groups=in_groups,
+ out_groups=out_groups,
+ webapp=webapp,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def memdump( self, trans, ids = 'None', sorts = 'None', pages = 'None', new_id = None, new_sort = None, **kwd ):
+ if self.app.memdump is None:
+ return trans.show_error_message( "Memdump is not enabled (set <code>use_memdump = True</code> in universe_wsgi.ini)" )
+ heap = self.app.memdump.get()
+ p = util.Params( kwd )
+ msg = None
+ if p.dump:
+ heap = self.app.memdump.get( update = True )
+ msg = "Heap dump complete"
+ elif p.setref:
+ self.app.memdump.setref()
+ msg = "Reference point set (dump to see delta from this point)"
+ ids = ids.split( ',' )
+ sorts = sorts.split( ',' )
+ if new_id is not None:
+ ids.append( new_id )
+ sorts.append( 'None' )
+ elif new_sort is not None:
+ sorts[-1] = new_sort
+ breadcrumb = "<a href='%s' class='breadcrumb'>heap</a>" % web.url_for()
+ # new lists so we can assemble breadcrumb links
+ new_ids = []
+ new_sorts = []
+ for id, sort in zip( ids, sorts ):
+ new_ids.append( id )
+ if id != 'None':
+ breadcrumb += "<a href='%s' class='breadcrumb'>[%s]</a>" % ( web.url_for( ids=','.join( new_ids ), sorts=','.join( new_sorts ) ), id )
+ heap = heap[int(id)]
+ new_sorts.append( sort )
+ if sort != 'None':
+ breadcrumb += "<a href='%s' class='breadcrumb'>.by('%s')</a>" % ( web.url_for( ids=','.join( new_ids ), sorts=','.join( new_sorts ) ), sort )
+ heap = heap.by( sort )
+ ids = ','.join( new_ids )
+ sorts = ','.join( new_sorts )
+ if p.theone:
+ breadcrumb += ".theone"
+ heap = heap.theone
+ return trans.fill_template( '/admin/memdump.mako', heap = heap, ids = ids, sorts = sorts, breadcrumb = breadcrumb, msg = msg )
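memdump encodes the heap drill-down path as two parallel comma-separated
lists, ids and sorts, which are replayed in order to rebuild both the heap
view and the breadcrumb trail. A worked example with hypothetical values,
where heap is whatever self.app.memdump.get() returned:

    # ids='2,5' with sorts='Size,None' replays as:
    #   heap = heap[2]; heap = heap.by('Size'); heap = heap[5]
    for id, sort in zip( '2,5'.split( ',' ), 'Size,None'.split( ',' ) ):
        if id != 'None':
            heap = heap[ int( id ) ]
        if sort != 'None':
            heap = heap.by( sort )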
+
+ @web.expose
+ @web.require_admin
+ def jobs( self, trans, stop = [], stop_msg = None, cutoff = 180, **kwd ):
+ deleted = []
+ msg = None
+ status = None
+ job_ids = util.listify( stop )
+ if job_ids and stop_msg in [ None, '' ]:
+ msg = 'Please enter an error message to display to the user describing why the job was terminated'
+ status = 'error'
+ elif job_ids:
+ if stop_msg[-1] not in string.punctuation:
+ stop_msg += '.'
+ for job_id in job_ids:
+ trans.app.job_manager.job_stop_queue.put( job_id, error_msg="This job was stopped by an administrator: %s For more information or help" % stop_msg )
+ deleted.append( str( job_id ) )
+ if deleted:
+ msg = 'Queued job'
+ if len( deleted ) > 1:
+ msg += 's'
+ msg += ' for deletion: '
+ msg += ', '.join( deleted )
+ status = 'done'
+ cutoff_time = datetime.utcnow() - timedelta( seconds=int( cutoff ) )
+ jobs = trans.sa_session.query( trans.app.model.Job ) \
+ .filter( and_( trans.app.model.Job.table.c.update_time < cutoff_time,
+ or_( trans.app.model.Job.state == trans.app.model.Job.states.NEW,
+ trans.app.model.Job.state == trans.app.model.Job.states.QUEUED,
+ trans.app.model.Job.state == trans.app.model.Job.states.RUNNING,
+ trans.app.model.Job.state == trans.app.model.Job.states.UPLOAD ) ) ) \
+ .order_by( trans.app.model.Job.table.c.update_time.desc() )
+ last_updated = {}
+ for job in jobs:
+ delta = datetime.utcnow() - job.update_time
+ if delta > timedelta( minutes=60 ):
+ last_updated[job.id] = '%s hours' % int( delta.seconds / 60 / 60 )
+ else:
+ last_updated[job.id] = '%s minutes' % int( delta.seconds / 60 )
+ return trans.fill_template( '/admin/jobs.mako',
+ jobs = jobs,
+ last_updated = last_updated,
+ cutoff = cutoff,
+ msg = msg,
+ status = status )
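Note that the last-updated labels above divide delta.seconds, which only
covers the time-of-day part of a timedelta and wraps past 24 hours. A
day-aware sketch of the same formatting, offered as an alternative rather
than what this changeset ships:

    from datetime import datetime

    def age_label( update_time, now=None ):
        # Fold whole days into the minute count before formatting.
        delta = ( now or datetime.utcnow() ) - update_time
        total_minutes = delta.days * 24 * 60 + delta.seconds // 60
        if total_minutes >= 60:
            return '%s hours' % ( total_minutes // 60 )
        return '%s minutes' % total_minutes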
+
+## ---- Utility methods -------------------------------------------------------
+
+def get_user( trans, id ):
+ """Get a User from the database by id."""
+ # Load user from database
+ id = trans.security.decode_id( id )
+ user = trans.sa_session.query( trans.model.User ).get( id )
+ if not user:
+ return trans.show_error_message( "User not found for id (%s)" % str( id ) )
+ return user
+def get_role( trans, id ):
+ """Get a Role from the database by id."""
+ # Load role from database
+ id = trans.security.decode_id( id )
+ role = trans.sa_session.query( trans.model.Role ).get( id )
+ if not role:
+ return trans.show_error_message( "Role not found for id (%s)" % str( id ) )
+ return role
+def get_group( trans, id ):
+ """Get a Group from the database by id."""
+ # Load group from database
+ id = trans.security.decode_id( id )
+ group = trans.sa_session.query( trans.model.Group ).get( id )
+ if not group:
+ return trans.show_error_message( "Group not found for id (%s)" % str( id ) )
+ return group
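get_user, get_role, and get_group above differ only in the mapped class and
the noun in the error message. A generic sketch of the shared shape, as a
hypothetical helper that is not part of this changeset:

    # One decode / query / check helper covering all three lookups.
    def get_object( trans, id, class_name ):
        decoded_id = trans.security.decode_id( id )
        item_class = getattr( trans.model, class_name )
        obj = trans.sa_session.query( item_class ).get( decoded_id )
        if not obj:
            return trans.show_error_message( "%s not found for id (%s)" % ( class_name, str( decoded_id ) ) )
        return obj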
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/admin.py
--- a/lib/galaxy/web/controllers/admin.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/admin.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,16 +1,10 @@
-import string, sys
-from datetime import datetime, timedelta
-from galaxy import util, datatypes
from galaxy.web.base.controller import *
-from galaxy.util.odict import odict
+from galaxy import model
from galaxy.model.orm import *
from galaxy.web.framework.helpers import time_ago, iff, grids
import logging
log = logging.getLogger( __name__ )
-# States for passing messages
-SUCCESS, INFO, WARNING, ERROR = "done", "info", "warning", "error"
-
class UserListGrid( grids.Grid ):
class EmailColumn( grids.TextColumn ):
def get_value( self, trans, grid, user ):
@@ -49,6 +43,7 @@
return 'never'
# Grid definition
+ webapp = "galaxy"
title = "Users"
model_class = model.User
template='/admin/user/grid.mako'
@@ -57,7 +52,7 @@
EmailColumn( "Email",
key="email",
model_class=model.User,
- link=( lambda item: dict( operation="information", id=item.id ) ),
+ link=( lambda item: dict( operation="information", id=item.id, webapp="galaxy" ) ),
attach_popup=True,
filterable="advanced" ),
UserNameColumn( "User Name",
@@ -79,11 +74,18 @@
visible=False,
filterable="standard" ) )
global_actions = [
- grids.GridAction( "Create new user", dict( controller='admin', action='users', operation='create' ) )
+ grids.GridAction( "Create new user", dict( controller='admin', action='users', operation='create', webapp="galaxy" ) )
]
operations = [
- grids.GridOperation( "Manage Roles and Groups", condition=( lambda item: not item.deleted ), allow_multiple=False ),
- grids.GridOperation( "Reset Password", condition=( lambda item: not item.deleted ), allow_multiple=True, allow_popup=False )
+ grids.GridOperation( "Manage Roles and Groups",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=False,
+ url_args=dict( webapp="galaxy", action="manage_roles_and_groups_for_user" ) ),
+ grids.GridOperation( "Reset Password",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=True,
+ allow_popup=False,
+ url_args=dict( webapp="galaxy", action="reset_user_password" ) )
]
#TODO: enhance to account for trans.app.config.allow_user_deletion here so that we can eliminate these operations if
# the setting is False
@@ -96,7 +98,6 @@
grids.GridColumnFilter( "Purged", args=dict( purged=True ) ),
grids.GridColumnFilter( "All", args=dict( deleted='All' ) )
]
- default_filter = dict( email="All", username="All", deleted="False", purged="False" )
num_rows_per_page = 50
preserve_state = False
use_paging = True
@@ -134,6 +135,7 @@
return 0
# Grid definition
+ webapp = "galaxy"
title = "Roles"
model_class = model.Role
template='/admin/dataset_security/role/grid.mako'
@@ -141,7 +143,7 @@
columns = [
NameColumn( "Name",
key="name",
- link=( lambda item: dict( operation="Manage users and groups", id=item.id ) ),
+ link=( lambda item: dict( operation="Manage users and groups", id=item.id, webapp="galaxy" ) ),
model_class=model.Role,
attach_popup=True,
filterable="advanced" ),
@@ -169,16 +171,27 @@
global_actions = [
grids.GridAction( "Add new role", dict( controller='admin', action='roles', operation='create' ) )
]
- operations = [ grids.GridOperation( "Rename", condition=( lambda item: not item.deleted ), allow_multiple=False ),
- grids.GridOperation( "Delete", condition=( lambda item: not item.deleted ), allow_multiple=True ),
- grids.GridOperation( "Undelete", condition=( lambda item: item.deleted ), allow_multiple=True ),
- grids.GridOperation( "Purge", condition=( lambda item: item.deleted ), allow_multiple=True ) ]
+ operations = [ grids.GridOperation( "Rename",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=False,
+ url_args=dict( webapp="galaxy", action="rename_role" ) ),
+ grids.GridOperation( "Delete",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="mark_role_deleted" ) ),
+ grids.GridOperation( "Undelete",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="undelete_role" ) ),
+ grids.GridOperation( "Purge",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="purge_role" ) ) ]
standard_filters = [
grids.GridColumnFilter( "Active", args=dict( deleted=False ) ),
grids.GridColumnFilter( "Deleted", args=dict( deleted=True ) ),
grids.GridColumnFilter( "All", args=dict( deleted='All' ) )
]
- default_filter = dict( name="All", deleted="False", description="All", type="All" )
num_rows_per_page = 50
preserve_state = False
use_paging = True
@@ -210,6 +223,7 @@
return 0
# Grid definition
+ webapp = "galaxy"
title = "Groups"
model_class = model.Group
template='/admin/dataset_security/group/grid.mako'
@@ -217,7 +231,7 @@
columns = [
NameColumn( "Name",
key="name",
- link=( lambda item: dict( operation="Manage users and roles", id=item.id ) ),
+ link=( lambda item: dict( operation="Manage users and roles", id=item.id, webapp="galaxy" ) ),
model_class=model.Group,
attach_popup=True,
filterable="advanced" ),
@@ -233,18 +247,29 @@
visible=False,
filterable="standard" ) )
global_actions = [
- grids.GridAction( "Add new group", dict( controller='admin', action='groups', operation='create' ) )
+ grids.GridAction( "Add new group", dict( controller='admin', action='groups', operation='create', webapp="galaxy" ) )
]
- operations = [ grids.GridOperation( "Rename", condition=( lambda item: not item.deleted ), allow_multiple=False ),
- grids.GridOperation( "Delete", condition=( lambda item: not item.deleted ), allow_multiple=True ),
- grids.GridOperation( "Undelete", condition=( lambda item: item.deleted ), allow_multiple=True ),
- grids.GridOperation( "Purge", condition=( lambda item: item.deleted ), allow_multiple=True ) ]
+ operations = [ grids.GridOperation( "Rename",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=False,
+ url_args=dict( webapp="galaxy", action="rename_group" ) ),
+ grids.GridOperation( "Delete",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="mark_group_deleted" ) ),
+ grids.GridOperation( "Undelete",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="undelete_group" ) ),
+ grids.GridOperation( "Purge",
+ condition=( lambda item: item.deleted ),
+ allow_multiple=True,
+ url_args=dict( webapp="galaxy", action="purge_group" ) ) ]
standard_filters = [
grids.GridColumnFilter( "Active", args=dict( deleted=False ) ),
grids.GridColumnFilter( "Deleted", args=dict( deleted=True ) ),
grids.GridColumnFilter( "All", args=dict( deleted='All' ) )
]
- default_filter = dict( name="All", deleted="False" )
num_rows_per_page = 50
preserve_state = False
use_paging = True
@@ -253,831 +278,8 @@
def build_initial_query( self, session ):
return session.query( self.model_class )
-class Admin( BaseController ):
+class AdminGalaxy( BaseController, Admin ):
user_list_grid = UserListGrid()
role_list_grid = RoleListGrid()
group_list_grid = GroupListGrid()
-
- @web.expose
- @web.require_admin
- def index( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- return trans.fill_template( '/admin/index.mako', message=message, status=status )
- @web.expose
- @web.require_admin
- def center( self, trans, **kwd ):
- return trans.fill_template( '/admin/center.mako' )
- @web.expose
- @web.require_admin
- def reload_tool( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- return trans.fill_template( '/admin/reload_tool.mako', toolbox=self.app.toolbox, message=message, status=status )
- @web.expose
- @web.require_admin
- def tool_reload( self, trans, tool_version=None, **kwd ):
- params = util.Params( kwd )
- tool_id = params.tool_id
- self.app.toolbox.reload( tool_id )
- message = 'Reloaded tool: ' + tool_id
- return trans.fill_template( '/admin/reload_tool.mako', toolbox=self.app.toolbox, message=message, status='done' )
-
- # Galaxy Role Stuff
- @web.expose
- @web.require_admin
- def roles( self, trans, **kwargs ):
- if 'operation' in kwargs:
- operation = kwargs['operation'].lower()
- if operation == "roles":
- return self.role( trans, **kwargs )
- if operation == "create":
- return self.create_role( trans, **kwargs )
- if operation == "delete":
- return self.mark_role_deleted( trans, **kwargs )
- if operation == "undelete":
- return self.undelete_role( trans, **kwargs )
- if operation == "purge":
- return self.purge_role( trans, **kwargs )
- if operation == "manage users and groups":
- return self.manage_users_and_groups_for_role( trans, **kwargs )
- if operation == "rename":
- return self.rename_role( trans, **kwargs )
- # Render the list view
- return self.role_list_grid( trans, **kwargs )
- @web.expose
- @web.require_admin
- def create_role( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- if params.get( 'create_role_button', False ):
- name = util.restore_text( params.name )
- description = util.restore_text( params.description )
- in_users = util.listify( params.get( 'in_users', [] ) )
- in_groups = util.listify( params.get( 'in_groups', [] ) )
- create_group_for_role = params.get( 'create_group_for_role', 'no' )
- if not name or not description:
- message = "Enter a valid name and a description"
- elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==name ).first():
- message = "A role with that name already exists"
- else:
- # Create the role
- role = trans.app.model.Role( name=name, description=description, type=trans.app.model.Role.types.ADMIN )
- trans.sa_session.add( role )
- # Create the UserRoleAssociations
- for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
- ura = trans.app.model.UserRoleAssociation( user, role )
- trans.sa_session.add( ura )
- # Create the GroupRoleAssociations
- for group in [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in in_groups ]:
- gra = trans.app.model.GroupRoleAssociation( group, role )
- trans.sa_session.add( gra )
- if create_group_for_role == 'yes':
- # Create the group
- group = trans.app.model.Group( name=name )
- trans.sa_session.add( group )
- message = "Group '%s' has been created, and role '%s' has been created with %d associated users and %d associated groups" % \
- ( group.name, role.name, len( in_users ), len( in_groups ) )
- else:
- message = "Role '%s' has been created with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
- trans.sa_session.flush()
- trans.response.send_redirect( web.url_for( controller='admin', action='roles', message=util.sanitize_text( message ), status='done' ) )
- trans.response.send_redirect( web.url_for( controller='admin', action='create_role', message=util.sanitize_text( message ), status='error' ) )
- out_users = []
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- out_users.append( ( user.id, user.email ) )
- out_groups = []
- for group in trans.sa_session.query( trans.app.model.Group ) \
- .filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ):
- out_groups.append( ( group.id, group.name ) )
- return trans.fill_template( '/admin/dataset_security/role/role_create.mako',
- in_users=[],
- out_users=out_users,
- in_groups=[],
- out_groups=out_groups,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def rename_role( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- role = get_role( trans, params.id )
- if params.get( 'rename_role_button', False ):
- old_name = role.name
- new_name = util.restore_text( params.name )
- new_description = util.restore_text( params.description )
- if not new_name:
- message = 'Enter a valid name'
- return trans.fill_template( '/admin/dataset_security/role/role_rename.mako', role=role, message=message, status='error' )
- elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==new_name ).first():
- message = 'A role with that name already exists'
- return trans.fill_template( '/admin/dataset_security/role/role_rename.mako', role=role, message=message, status='error' )
- else:
- role.name = new_name
- role.description = new_description
- trans.sa_session.add( role )
- trans.sa_session.flush()
- message = "Role '%s' has been renamed to '%s'" % ( old_name, new_name )
- return trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status='done' ) )
- return trans.fill_template( '/admin/dataset_security/role/role_rename.mako', role=role, message=message, status=status )
- @web.expose
- @web.require_admin
- def manage_users_and_groups_for_role( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- role = get_role( trans, params.id )
- if params.get( 'role_members_edit_button', False ):
- in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
- for ura in role.users:
- user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
- if user not in in_users:
- # Delete DefaultUserPermissions for previously associated users that have been removed from the role
- for dup in user.default_permissions:
- if role == dup.role:
- trans.sa_session.delete( dup )
- # Delete DefaultHistoryPermissions for previously associated users that have been removed from the role
- for history in user.histories:
- for dhp in history.default_permissions:
- if role == dhp.role:
- trans.sa_session.delete( dhp )
- trans.sa_session.flush()
- in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( params.in_groups ) ]
- trans.app.security_agent.set_entity_role_associations( roles=[ role ], users=in_users, groups=in_groups )
- trans.sa_session.refresh( role )
- message = "Role '%s' has been updated with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
- trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status=status ) )
- in_users = []
- out_users = []
- in_groups = []
- out_groups = []
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- if user in [ x.user for x in role.users ]:
- in_users.append( ( user.id, user.email ) )
- else:
- out_users.append( ( user.id, user.email ) )
- for group in trans.sa_session.query( trans.app.model.Group ) \
- .filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ):
- if group in [ x.group for x in role.groups ]:
- in_groups.append( ( group.id, group.name ) )
- else:
- out_groups.append( ( group.id, group.name ) )
- # Build a dictionary keyed by Library whose values map each folder path to the
- # list of actions whose DatasetPermissions are associated with the Role:
- # { Library : { folder_path : [ action, action ] } }
- library_dataset_actions = {}
- for dp in role.dataset_actions:
- for ldda in trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ) \
- .filter( trans.app.model.LibraryDatasetDatasetAssociation.dataset_id==dp.dataset_id ):
- root_found = False
- folder_path = ''
- folder = ldda.library_dataset.folder
- while not root_found:
- folder_path = '%s / %s' % ( folder.name, folder_path )
- if not folder.parent:
- root_found = True
- else:
- folder = folder.parent
- folder_path = '%s %s' % ( folder_path, ldda.name )
- library = trans.sa_session.query( trans.app.model.Library ) \
- .filter( trans.app.model.Library.table.c.root_folder_id == folder.id ) \
- .first()
- if library not in library_dataset_actions:
- library_dataset_actions[ library ] = {}
- try:
- library_dataset_actions[ library ][ folder_path ].append( dp.action )
- except KeyError:
- library_dataset_actions[ library ][ folder_path ] = [ dp.action ]
- return trans.fill_template( '/admin/dataset_security/role/role.mako',
- role=role,
- in_users=in_users,
- out_users=out_users,
- in_groups=in_groups,
- out_groups=out_groups,
- library_dataset_actions=library_dataset_actions,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def mark_role_deleted( self, trans, **kwd ):
- params = util.Params( kwd )
- role = get_role( trans, params.id )
- role.deleted = True
- trans.sa_session.add( role )
- trans.sa_session.flush()
- message = "Role '%s' has been marked as deleted." % role.name
- trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status='done' ) )
- @web.expose
- @web.require_admin
- def undelete_role( self, trans, **kwd ):
- params = util.Params( kwd )
- role = get_role( trans, params.id )
- role.deleted = False
- trans.sa_session.add( role )
- trans.sa_session.flush()
- message = "Role '%s' has been marked as not deleted." % role.name
- trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status='done' ) )
- @web.expose
- @web.require_admin
- def purge_role( self, trans, **kwd ):
- # This method should only be called for a Role that has previously been deleted.
- # Purging a deleted Role deletes all of the following from the database:
- # - UserRoleAssociations where role_id == Role.id
- # - DefaultUserPermissions where role_id == Role.id
- # - DefaultHistoryPermissions where role_id == Role.id
- # - GroupRoleAssociations where role_id == Role.id
- # - DatasetPermissions where role_id == Role.id
- params = util.Params( kwd )
- role = get_role( trans, params.id )
- if not role.deleted:
- message = "Role '%s' has not been deleted, so it cannot be purged." % role.name
- trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status='error' ) )
- # Delete UserRoleAssociations
- for ura in role.users:
- user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
- # Delete DefaultUserPermissions for associated users
- for dup in user.default_permissions:
- if role == dup.role:
- trans.sa_session.delete( dup )
- # Delete DefaultHistoryPermissions for associated users
- for history in user.histories:
- for dhp in history.default_permissions:
- if role == dhp.role:
- trans.sa_session.delete( dhp )
- trans.sa_session.delete( ura )
- # Delete GroupRoleAssociations
- for gra in role.groups:
- trans.sa_session.delete( gra )
- # Delete DatasetPermissions
- for dp in role.dataset_actions:
- trans.sa_session.delete( dp )
- trans.sa_session.flush()
- message = "The following have been purged from the database for role '%s': " % role.name
- message += "DefaultUserPermissions, DefaultHistoryPermissions, UserRoleAssociations, GroupRoleAssociations, DatasetPermissionss."
- trans.response.send_redirect( web.url_for( action='roles', message=util.sanitize_text( message ), status='done' ) )
-
- # Galaxy Group Stuff
- @web.expose
- @web.require_admin
- def groups( self, trans, **kwargs ):
- if 'operation' in kwargs:
- operation = kwargs['operation'].lower()
- if operation == "groups":
- return self.group( trans, **kwargs )
- if operation == "create":
- return self.create_group( trans, **kwargs )
- if operation == "delete":
- return self.mark_group_deleted( trans, **kwargs )
- if operation == "undelete":
- return self.undelete_group( trans, **kwargs )
- if operation == "purge":
- return self.purge_group( trans, **kwargs )
- if operation == "manage users and roles":
- return self.manage_users_and_roles_for_group( trans, **kwargs )
- if operation == "rename":
- return self.rename_group( trans, **kwargs )
- # Render the list view
- return self.group_list_grid( trans, **kwargs )
- @web.expose
- @web.require_admin
- def rename_group( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- group = get_group( trans, params.id )
- if params.get( 'rename_group_button', False ):
- old_name = group.name
- new_name = util.restore_text( params.name )
- if not new_name:
- message = 'Enter a valid name'
- return trans.fill_template( '/admin/dataset_security/group/group_rename.mako', group=group, message=message, status='error' )
- elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==new_name ).first():
- message = 'A group with that name already exists'
- return trans.fill_template( '/admin/dataset_security/group/group_rename.mako', group=group, message=message, status='error' )
- else:
- group.name = new_name
- trans.sa_session.add( group )
- trans.sa_session.flush()
- message = "Group '%s' has been renamed to '%s'" % ( old_name, new_name )
- return trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status='done' ) )
- return trans.fill_template( '/admin/dataset_security/group/group_rename.mako', group=group, message=message, status=status )
- @web.expose
- @web.require_admin
- def manage_users_and_roles_for_group( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- group = get_group( trans, params.id )
- if params.get( 'group_roles_users_edit_button', False ):
- in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.in_roles ) ]
- in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
- trans.app.security_agent.set_entity_group_associations( groups=[ group ], roles=in_roles, users=in_users )
- trans.sa_session.refresh( group )
- message += "Group '%s' has been updated with %d associated roles and %d associated users" % ( group.name, len( in_roles ), len( in_users ) )
- trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status=status ) )
- in_roles = []
- out_roles = []
- in_users = []
- out_users = []
- for role in trans.sa_session.query(trans.app.model.Role ) \
- .filter( trans.app.model.Role.table.c.deleted==False ) \
- .order_by( trans.app.model.Role.table.c.name ):
- if role in [ x.role for x in group.roles ]:
- in_roles.append( ( role.id, role.name ) )
- else:
- out_roles.append( ( role.id, role.name ) )
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- if user in [ x.user for x in group.users ]:
- in_users.append( ( user.id, user.email ) )
- else:
- out_users.append( ( user.id, user.email ) )
- message += "Group '%s' is currently associated with %d roles and %d users" % ( group.name, len( in_roles ), len( in_users ) )
- return trans.fill_template( '/admin/dataset_security/group/group.mako',
- group=group,
- in_roles=in_roles,
- out_roles=out_roles,
- in_users=in_users,
- out_users=out_users,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def create_group( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- if params.get( 'create_group_button', False ):
- name = util.restore_text( params.name )
- in_users = util.listify( params.get( 'in_users', [] ) )
- in_roles = util.listify( params.get( 'in_roles', [] ) )
- if not name:
- message = "Enter a valid name"
- elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==name ).first():
- message = "A group with that name already exists"
- else:
- # Create the group
- group = trans.app.model.Group( name=name )
- trans.sa_session.add( group )
- trans.sa_session.flush()
- # Create the UserGroupAssociations
- for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
- uga = trans.app.model.UserGroupAssociation( user, group )
- trans.sa_session.add( uga )
- trans.sa_session.flush()
- # Create the GroupRoleAssociations
- for role in [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in in_roles ]:
- gra = trans.app.model.GroupRoleAssociation( group, role )
- trans.sa_session.add( gra )
- trans.sa_session.flush()
- message = "Group '%s' has been created with %d associated users and %d associated roles" % ( name, len( in_users ), len( in_roles ) )
- trans.response.send_redirect( web.url_for( controller='admin', action='groups', message=util.sanitize_text( message ), status='done' ) )
- trans.response.send_redirect( web.url_for( controller='admin', action='create_group', message=util.sanitize_text( message ), status='error' ) )
- out_users = []
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- out_users.append( ( user.id, user.email ) )
- out_roles = []
- for role in trans.sa_session.query( trans.app.model.Role ) \
- .filter( trans.app.model.Role.table.c.deleted==False ) \
- .order_by( trans.app.model.Role.table.c.name ):
- out_roles.append( ( role.id, role.name ) )
- return trans.fill_template( '/admin/dataset_security/group/group_create.mako',
- in_users=[],
- out_users=out_users,
- in_roles=[],
- out_roles=out_roles,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def mark_group_deleted( self, trans, **kwd ):
- params = util.Params( kwd )
- group = get_group( trans, params.id )
- group.deleted = True
- trans.sa_session.add( group )
- trans.sa_session.flush()
- message = "Group '%s' has been marked as deleted." % group.name
- trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status='done' ) )
- @web.expose
- @web.require_admin
- def undelete_group( self, trans, **kwd ):
- params = util.Params( kwd )
- group = get_group( trans, params.id )
- group.deleted = False
- trans.sa_session.add( group )
- trans.sa_session.flush()
- message = "Group '%s' has been marked as not deleted." % group.name
- trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status='done' ) )
- @web.expose
- @web.require_admin
- def purge_group( self, trans, **kwd ):
- # This method should only be called for a Group that has previously been deleted.
- # Purging a deleted Group simply deletes all UserGroupAssociations and GroupRoleAssociations.
- params = util.Params( kwd )
- group = get_group( trans, params.id )
- if not group.deleted:
- # We should never reach here, but just in case there is a bug somewhere...
- message = "Group '%s' has not been deleted, so it cannot be purged." % group.name
- trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status='error' ) )
- # Delete UserGroupAssociations
- for uga in group.users:
- trans.sa_session.delete( uga )
- # Delete GroupRoleAssociations
- for gra in group.roles:
- trans.sa_session.delete( gra )
- trans.sa_session.flush()
- message = "The following have been purged from the database for group '%s': UserGroupAssociations, GroupRoleAssociations." % group.name
- trans.response.send_redirect( web.url_for( action='groups', message=util.sanitize_text( message ), status='done' ) )
-
- # Galaxy User Stuff
- @web.expose
- @web.require_admin
- def create_new_user( self, trans, **kwargs ):
- return trans.response.send_redirect( web.url_for( controller='user',
- action='create',
- admin_view=True ) )
- @web.expose
- @web.require_admin
- def reset_user_password( self, trans, **kwd ):
- id = kwd.get( 'id', None )
- if not id:
- message = "No user ids received for resetting passwords"
- trans.response.send_redirect( web.url_for( action='users', message=message, status='error' ) )
- ids = util.listify( id )
- if 'reset_user_password_button' in kwd:
- message = ''
- status = ''
- for user_id in ids:
- user = get_user( trans, user_id )
- password = kwd.get( 'password', None )
- confirm = kwd.get( 'confirm', None )
- if not password or len( password ) < 6:
- message = "Please use a password of at least 6 characters"
- status = 'error'
- break
- elif password != confirm:
- message = "Passwords do not match"
- status = 'error'
- break
- else:
- user.set_password_cleartext( password )
- trans.sa_session.add( user )
- trans.sa_session.flush()
- if not message and not status:
- message = "Passwords reset for %d users" % len( ids )
- status = 'done'
- trans.response.send_redirect( web.url_for( action='users',
- message=util.sanitize_text( message ),
- status=status ) )
- users = [ get_user( trans, user_id ) for user_id in ids ]
- if len( ids ) > 1:
- id = ','.join( ids )
- return trans.fill_template( '/admin/user/reset_password.mako',
- id=id,
- users=users,
- password='',
- confirm='' )
- @web.expose
- @web.require_admin
- def mark_user_deleted( self, trans, **kwd ):
- id = kwd.get( 'id', None )
- if not id:
- message = "No user ids received for deleting"
- trans.response.send_redirect( web.url_for( action='users', message=message, status='error' ) )
- ids = util.listify( id )
- message = "Deleted %d users: " % len( ids )
- for user_id in ids:
- user = get_user( trans, user_id )
- user.deleted = True
- trans.sa_session.add( user )
- trans.sa_session.flush()
- message += " %s " % user.email
- trans.response.send_redirect( web.url_for( action='users', message=util.sanitize_text( message ), status='done' ) )
- @web.expose
- @web.require_admin
- def undelete_user( self, trans, **kwd ):
- id = kwd.get( 'id', None )
- if not id:
- message = "No user ids received for undeleting"
- trans.response.send_redirect( web.url_for( action='users', message=message, status='error' ) )
- ids = util.listify( id )
- count = 0
- undeleted_users = ""
- for user_id in ids:
- user = get_user( trans, user_id )
- if user.deleted:
- user.deleted = False
- trans.sa_session.add( user )
- trans.sa_session.flush()
- count += 1
- undeleted_users += " %s" % user.email
- message = "Undeleted %d users: %s" % ( count, undeleted_users )
- trans.response.send_redirect( web.url_for( action='users',
- message=util.sanitize_text( message ),
- status='done' ) )
- @web.expose
- @web.require_admin
- def purge_user( self, trans, **kwd ):
- # This method should only be called for a User that has previously been deleted.
- # We keep the User in the database ( marked as purged ), along with everything
- # associated with the user's private role, in case we want the ability to
- # unpurge the user at some point in the future.
- # Purging a deleted User deletes all of the following:
- # - History where user_id == User.id
- # - HistoryDatasetAssociation where history_id == History.id
- # - Dataset where HistoryDatasetAssociation.dataset_id == Dataset.id
- # - UserGroupAssociation where user_id == User.id
- # - UserRoleAssociation where user_id == User.id EXCEPT FOR THE PRIVATE ROLE
- # Purging Histories and Datasets must be handled via the cleanup_datasets.py script
- id = kwd.get( 'id', None )
- if not id:
- message = "No user ids received for purging"
- trans.response.send_redirect( web.url_for( action='users',
- message=util.sanitize_text( message ),
- status='error' ) )
- ids = util.listify( id )
- message = "Purged %d users: " % len( ids )
- for user_id in ids:
- user = get_user( trans, user_id )
- if not user.deleted:
- # We should never reach here, but just in case there is a bug somewhere...
- message = "User '%s' has not been deleted, so it cannot be purged." % user.email
- trans.response.send_redirect( web.url_for( action='users',
- message=util.sanitize_text( message ),
- status='error' ) )
- private_role = trans.app.security_agent.get_private_user_role( user )
- # Delete History
- for h in user.active_histories:
- trans.sa_session.refresh( h )
- for hda in h.active_datasets:
- # Delete HistoryDatasetAssociation
- d = trans.sa_session.query( trans.app.model.Dataset ).get( hda.dataset_id )
- # Delete Dataset
- if not d.deleted:
- d.deleted = True
- trans.sa_session.add( d )
- hda.deleted = True
- trans.sa_session.add( hda )
- h.deleted = True
- trans.sa_session.add( h )
- # Delete UserGroupAssociations
- for uga in user.groups:
- trans.sa_session.delete( uga )
- # Delete UserRoleAssociations EXCEPT FOR THE PRIVATE ROLE
- for ura in user.roles:
- if ura.role_id != private_role.id:
- trans.sa_session.delete( ura )
- # Purge the user
- user.purged = True
- trans.sa_session.add( user )
- trans.sa_session.flush()
- message += "%s " % user.email
- trans.response.send_redirect( web.url_for( controller='admin',
- action='users',
- message=util.sanitize_text( message ),
- status='done' ) )
- @web.expose
- @web.require_admin
- def users( self, trans, **kwargs ):
- if 'operation' in kwargs:
- operation = kwargs['operation'].lower()
- if operation == "roles":
- return self.user( trans, **kwargs )
- if operation == "reset password":
- return self.reset_user_password( trans, **kwargs )
- if operation == "delete":
- return self.mark_user_deleted( trans, **kwargs )
- if operation == "undelete":
- return self.undelete_user( trans, **kwargs )
- if operation == "purge":
- return self.purge_user( trans, **kwargs )
- if operation == "create":
- return self.create_new_user( trans, **kwargs )
- if operation == "information":
- return self.user_info( trans, **kwargs )
- if operation == "manage roles and groups":
- return self.manage_roles_and_groups_for_user( trans, **kwargs )
- # Render the list view
- return self.user_list_grid( trans, **kwargs )
- @web.expose
- @web.require_admin
- def user_info( self, trans, **kwd ):
- '''
- Display the user information page, which consists of the user's login
- information, public username, password reset form, and other information
- obtained during registration.
- '''
- user_id = kwd.get( 'id', None )
- if not user_id:
- message += "Invalid user id (%s) received" % str( user_id )
- trans.response.send_redirect( web.url_for( controller='admin',
- action='users',
- message=util.sanitize_text( message ),
- status='error' ) )
- user = get_user( trans, user_id )
- return trans.response.send_redirect( web.url_for( controller='user',
- action='show_info',
- user_id=user.id,
- admin_view=True,
- **kwd ) )
- @web.expose
- @web.require_admin
- def name_autocomplete_data( self, trans, q=None, limit=None, timestamp=None ):
- """Return autocomplete data for user emails"""
- ac_data = ""
- for user in trans.sa_session.query( User ).filter_by( deleted=False ).filter( func.lower( User.email ).like( q.lower() + "%" ) ):
- ac_data = ac_data + user.email + "\n"
- return ac_data
- @web.expose
- @web.require_admin
- def manage_roles_and_groups_for_user( self, trans, **kwd ):
- user_id = kwd.get( 'id', None )
- message = ''
- status = ''
- if not user_id:
- message += "Invalid user id (%s) received" % str( user_id )
- trans.response.send_redirect( web.url_for( controller='admin',
- action='users',
- message=util.sanitize_text( message ),
- status='error' ) )
- user = get_user( trans, user_id )
- private_role = trans.app.security_agent.get_private_user_role( user )
- if kwd.get( 'user_roles_groups_edit_button', False ):
- # Make sure the user is not being disassociated from their private role
- out_roles = kwd.get( 'out_roles', [] )
- if out_roles:
- out_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( out_roles ) ]
- if private_role in out_roles:
- message += "You cannot eliminate a user's private role association. "
- status = 'error'
- in_roles = kwd.get( 'in_roles', [] )
- if in_roles:
- in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( in_roles ) ]
- out_groups = kwd.get( 'out_groups', [] )
- if out_groups:
- out_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( out_groups ) ]
- in_groups = kwd.get( 'in_groups', [] )
- if in_groups:
- in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( in_groups ) ]
- if in_roles:
- trans.app.security_agent.set_entity_user_associations( users=[ user ], roles=in_roles, groups=in_groups )
- trans.sa_session.refresh( user )
- message += "User '%s' has been updated with %d associated roles and %d associated groups (private roles are not displayed)" % \
- ( user.email, len( in_roles ), len( in_groups ) )
- trans.response.send_redirect( web.url_for( action='users',
- message=util.sanitize_text( message ),
- status='done' ) )
- in_roles = []
- out_roles = []
- in_groups = []
- out_groups = []
- for role in trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.deleted==False ) \
- .order_by( trans.app.model.Role.table.c.name ):
- if role in [ x.role for x in user.roles ]:
- in_roles.append( ( role.id, role.name ) )
- elif role.type != trans.app.model.Role.types.PRIVATE:
- # There is a 1 to 1 mapping between a user and a PRIVATE role, so private roles should
- # not be listed in the roles form fields, except for the currently selected user's private
- # role, which should always be in in_roles. The check above is added as an additional
- # precaution, since for a period of time we were including private roles in the form fields.
- out_roles.append( ( role.id, role.name ) )
- for group in trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ):
- if group in [ x.group for x in user.groups ]:
- in_groups.append( ( group.id, group.name ) )
- else:
- out_groups.append( ( group.id, group.name ) )
- message += "User '%s' is currently associated with %d roles and is a member of %d groups" % \
- ( user.email, len( in_roles ), len( in_groups ) )
- if not status:
- status = 'done'
- return trans.fill_template( '/admin/user/user.mako',
- user=user,
- in_roles=in_roles,
- out_roles=out_roles,
- in_groups=in_groups,
- out_groups=out_groups,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def memdump( self, trans, ids = 'None', sorts = 'None', pages = 'None', new_id = None, new_sort = None, **kwd ):
- if self.app.memdump is None:
- return trans.show_error_message( "Memdump is not enabled (set <code>use_memdump = True</code> in universe_wsgi.ini)" )
- heap = self.app.memdump.get()
- p = util.Params( kwd )
- msg = None
- if p.dump:
- heap = self.app.memdump.get( update = True )
- msg = "Heap dump complete"
- elif p.setref:
- self.app.memdump.setref()
- msg = "Reference point set (dump to see delta from this point)"
- ids = ids.split( ',' )
- sorts = sorts.split( ',' )
- if new_id is not None:
- ids.append( new_id )
- sorts.append( 'None' )
- elif new_sort is not None:
- sorts[-1] = new_sort
- breadcrumb = "<a href='%s' class='breadcrumb'>heap</a>" % web.url_for()
- # new lists so we can assemble breadcrumb links
- new_ids = []
- new_sorts = []
- for id, sort in zip( ids, sorts ):
- new_ids.append( id )
- if id != 'None':
- breadcrumb += "<a href='%s' class='breadcrumb'>[%s]</a>" % ( web.url_for( ids=','.join( new_ids ), sorts=','.join( new_sorts ) ), id )
- heap = heap[int(id)]
- new_sorts.append( sort )
- if sort != 'None':
- breadcrumb += "<a href='%s' class='breadcrumb'>.by('%s')</a>" % ( web.url_for( ids=','.join( new_ids ), sorts=','.join( new_sorts ) ), sort )
- heap = heap.by( sort )
- ids = ','.join( new_ids )
- sorts = ','.join( new_sorts )
- if p.theone:
- breadcrumb += ".theone"
- heap = heap.theone
- return trans.fill_template( '/admin/memdump.mako', heap = heap, ids = ids, sorts = sorts, breadcrumb = breadcrumb, msg = msg )
-
- @web.expose
- @web.require_admin
- def jobs( self, trans, stop = [], stop_msg = None, cutoff = 180, **kwd ):
- deleted = []
- msg = None
- status = None
- job_ids = util.listify( stop )
- if job_ids and stop_msg in [ None, '' ]:
- msg = 'Please enter an error message to display to the user describing why the job was terminated'
- status = 'error'
- elif job_ids:
- if stop_msg[-1] not in string.punctuation:
- stop_msg += '.'
- for job_id in job_ids:
- trans.app.job_manager.job_stop_queue.put( job_id, error_msg="This job was stopped by an administrator: %s For more information or help" % stop_msg )
- deleted.append( str( job_id ) )
- if deleted:
- msg = 'Queued job'
- if len( deleted ) > 1:
- msg += 's'
- msg += ' for deletion: '
- msg += ', '.join( deleted )
- status = 'done'
- cutoff_time = datetime.utcnow() - timedelta( seconds=int( cutoff ) )
- jobs = trans.sa_session.query( trans.app.model.Job ) \
- .filter( and_( trans.app.model.Job.table.c.update_time < cutoff_time,
- or_( trans.app.model.Job.state == trans.app.model.Job.states.NEW,
- trans.app.model.Job.state == trans.app.model.Job.states.QUEUED,
- trans.app.model.Job.state == trans.app.model.Job.states.RUNNING,
- trans.app.model.Job.state == trans.app.model.Job.states.UPLOAD ) ) ) \
- .order_by( trans.app.model.Job.table.c.update_time.desc() )
- last_updated = {}
- for job in jobs:
- delta = datetime.utcnow() - job.update_time
- if delta > timedelta( minutes=60 ):
- last_updated[job.id] = '%s hours' % int( delta.seconds / 60 / 60 )
- else:
- last_updated[job.id] = '%s minutes' % int( delta.seconds / 60 )
- return trans.fill_template( '/admin/jobs.mako', jobs = jobs, last_updated = last_updated, cutoff = cutoff, msg = msg, status = status )
-
-## ---- Utility methods -------------------------------------------------------
-
-def get_user( trans, id ):
- """Get a User from the database by id."""
- # Load user from database
- id = trans.security.decode_id( id )
- user = trans.sa_session.query( model.User ).get( id )
- if not user:
- return trans.show_error_message( "User not found for id (%s)" % str( id ) )
- return user
-def get_role( trans, id ):
- """Get a Role from the database by id."""
- # Load role from database
- id = trans.security.decode_id( id )
- role = trans.sa_session.query( model.Role ).get( id )
- if not role:
- return trans.show_error_message( "Role not found for id (%s)" % str( id ) )
- return role
-def get_group( trans, id ):
- """Get a Group from the database by id."""
- # Load group from database
- id = trans.security.decode_id( id )
- group = trans.sa_session.query( model.Group ).get( id )
- if not group:
- return trans.show_error_message( "Group not found for id (%s)" % str( id ) )
- return group
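The methods deleted above are consolidated rather than dropped: AdminGalaxy now mixes a shared Admin class (reached through galaxy.web.base.controller, as the buildapp.py hunk below shows) into BaseController, and each webapp only supplies its own grid instances. A minimal runnable sketch of that layout -- everything except the BaseController/Admin/AdminGalaxy names is an illustrative stand-in:

class BaseController( object ):
    def __init__( self, app ):
        self.app = app

class UserListGrid( object ):
    # Grids are callable objects, like grids.Grid.__call__ further below.
    def __call__( self, trans, **kwargs ):
        return 'rendered user grid'

class Admin( object ):
    # Webapp-agnostic admin behavior; the mixing-in subclass supplies grids.
    user_list_grid = None
    def users( self, trans, **kwargs ):
        # Dispatch on 'operation', falling back to the list view -- the same
        # pattern as the deleted users()/roles()/groups() methods above.
        operation = kwargs.get( 'operation', '' ).lower()
        if operation == 'create':
            return self.create_new_user( trans, **kwargs )
        return self.user_list_grid( trans, **kwargs )
    def create_new_user( self, trans, **kwargs ):
        return 'create-user form'

class AdminGalaxy( BaseController, Admin ):
    user_list_grid = UserListGrid()

admin = AdminGalaxy( app=None )
print( admin.users( trans=None ) )                      # rendered user grid
print( admin.users( trans=None, operation='Create' ) )  # create-user form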
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/dataset.py
--- a/lib/galaxy/web/controllers/dataset.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/dataset.py Wed Apr 21 11:35:21 2010 -0400
@@ -468,7 +468,7 @@
dataset = self.get_dataset( trans, slug, False, True )
if dataset:
truncated, dataset_data = self.get_data( dataset, preview )
- dataset.annotation = self.get_item_annotation_str( trans.sa_session, dataset.history.user, dataset )
+ dataset.annotation = self.get_item_annotation_str( trans, dataset.history.user, dataset )
return trans.fill_template_mako( "/dataset/display.mako", item=dataset, item_data=dataset_data, truncated=truncated )
else:
raise web.httpexceptions.HTTPNotFound()
@@ -482,7 +482,7 @@
raise web.httpexceptions.HTTPNotFound()
truncated, dataset_data = self.get_data( dataset, preview=True )
# Get annotation.
- dataset.annotation = self.get_item_annotation_str( trans.sa_session, trans.get_user(), dataset )
+ dataset.annotation = self.get_item_annotation_str( trans, trans.user, dataset )
return trans.stream_template_mako( "/dataset/item_content.mako", item=dataset, item_data=dataset_data, truncated=truncated )
@web.expose
@@ -502,7 +502,7 @@
dataset = self.get_dataset( trans, id, False, True )
if not dataset:
raise web.httpexceptions.HTTPNotFound()
- return self.get_item_annotation_str( trans.sa_session, trans.get_user(), dataset )
+ return self.get_item_annotation_str( trans, trans.user, dataset )
@web.expose
def display_at( self, trans, dataset_id, filename=None, **kwd ):
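This dataset.py hunk is the first of several below that change get_item_annotation_str from taking a bare sa_session to taking the full trans; passing the transaction lets shared helpers reach the current webapp's model as well as the session. A toy, self-contained illustration of the new call shape (the dictionary-backed session is hypothetical; Galaxy's real helper queries an *AnnotationAssociation table):

class Trans( object ):
    # Stand-in for the request transaction: carries the session (and, in
    # Galaxy, the app, model, and user as well).
    def __init__( self, sa_session ):
        self.sa_session = sa_session

def get_item_annotation_str( trans, user, item ):
    # Return the annotation 'user' attached to 'item', or None.
    return trans.sa_session.get( ( user, item ) )

session = { ( 'alice', 'dataset-1' ): 'QC-passed reads' }
trans = Trans( session )
print( get_item_annotation_str( trans, 'alice', 'dataset-1' ) )  # QC-passed reads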
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/forms.py
--- a/lib/galaxy/web/controllers/forms.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/forms.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,7 +1,7 @@
from galaxy.web.base.controller import *
from galaxy.model.orm import *
from galaxy.datatypes import sniff
-from galaxy import util
+from galaxy import model, util
import logging, os, sys
from galaxy.web.form_builder import *
from galaxy.tools.parameters.basic import parameter_types
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/history.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,6 +1,6 @@
from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import time_ago, iff, grids
-from galaxy import util
+from galaxy import model, util
from galaxy.util.odict import odict
from galaxy.model.mapping import desc
from galaxy.model.orm import *
@@ -490,9 +490,9 @@
# Get datasets.
datasets = self.get_history_datasets( trans, history )
# Get annotations.
- history.annotation = self.get_item_annotation_str( trans.sa_session, history.user, history )
+ history.annotation = self.get_item_annotation_str( trans, history.user, history )
for dataset in datasets:
- dataset.annotation = self.get_item_annotation_str( trans.sa_session, history.user, dataset )
+ dataset.annotation = self.get_item_annotation_str( trans, history.user, dataset )
return trans.stream_template_mako( "/history/item_content.mako", item = history, item_data = datasets )
@web.expose
@@ -613,9 +613,9 @@
# Get datasets.
datasets = self.get_history_datasets( trans, history )
# Get annotations.
- history.annotation = self.get_item_annotation_str( trans.sa_session, history.user, history )
+ history.annotation = self.get_item_annotation_str( trans, history.user, history )
for dataset in datasets:
- dataset.annotation = self.get_item_annotation_str( trans.sa_session, history.user, dataset )
+ dataset.annotation = self.get_item_annotation_str( trans, history.user, dataset )
return trans.stream_template_mako( "history/display.mako",
item = history, item_data = datasets )
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/library_admin.py
--- a/lib/galaxy/web/controllers/library_admin.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/library_admin.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,5 +1,5 @@
import sys
-from galaxy import util
+from galaxy import model, util
from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy.model.orm import *
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/page.py
--- a/lib/galaxy/web/controllers/page.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/page.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,3 +1,4 @@
+from galaxy import model
from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import time_ago, grids
from galaxy.util.sanitize_html import sanitize_html, _BaseHTMLProcessor
@@ -406,7 +407,7 @@
else:
page_title = page.title
page_slug = page.slug
- page_annotation = self.get_item_annotation_str( trans.sa_session, trans.get_user(), page )
+ page_annotation = self.get_item_annotation_str( trans, trans.user, page )
if not page_annotation:
page_annotation = ""
return trans.show_form(
@@ -527,7 +528,7 @@
annotations = from_json_string( annotations )
for annotation_dict in annotations:
item_id = trans.security.decode_id( annotation_dict[ 'item_id' ] )
- item_class = self.get_class( annotation_dict[ 'item_class' ] )
+ item_class = self.get_class( trans, annotation_dict[ 'item_class' ] )
item = trans.sa_session.query( item_class ).filter_by( id=item_id ).first()
if not item:
raise RuntimeError( "cannot find annotated item" )
@@ -693,28 +694,28 @@
def _get_embed_html( self, trans, item_class, item_id ):
""" Returns HTML for embedding an item in a page. """
- item_class = self.get_class( item_class )
+ item_class = self.get_class( trans, item_class )
if item_class == model.History:
history = self.get_history( trans, item_id, False, True )
- history.annotation = self.get_item_annotation_str( trans.sa_session, history.user, history )
+ history.annotation = self.get_item_annotation_str( trans, history.user, history )
if history:
datasets = self.get_history_datasets( trans, history )
return trans.fill_template( "history/embed.mako", item=history, item_data=datasets )
elif item_class == model.HistoryDatasetAssociation:
dataset = self.get_dataset( trans, item_id, False, True )
- dataset.annotation = self.get_item_annotation_str( trans.sa_session, dataset.history.user, dataset )
+ dataset.annotation = self.get_item_annotation_str( trans, dataset.history.user, dataset )
if dataset:
data = self.get_data( dataset )
return trans.fill_template( "dataset/embed.mako", item=dataset, item_data=data )
elif item_class == model.StoredWorkflow:
workflow = self.get_stored_workflow( trans, item_id, False, True )
- workflow.annotation = self.get_item_annotation_str( trans.sa_session, workflow.user, workflow )
+ workflow.annotation = self.get_item_annotation_str( trans, workflow.user, workflow )
if workflow:
self.get_stored_workflow_steps( trans, workflow )
return trans.fill_template( "workflow/embed.mako", item=workflow, item_data=workflow.latest_workflow.steps )
elif item_class == model.Visualization:
visualization = self.get_visualization( trans, item_id, False, True )
- visualization.annotation = self.get_item_annotation_str( trans.sa_session, visualization.user, visualization )
+ visualization.annotation = self.get_item_annotation_str( trans, visualization.user, visualization )
if visualization:
return trans.fill_template( "visualization/embed.mako", item=visualization, item_data=None )
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/requests.py
--- a/lib/galaxy/web/controllers/requests.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/requests.py Wed Apr 21 11:35:21 2010 -0400
@@ -2,7 +2,7 @@
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy.model.orm import *
from galaxy.datatypes import sniff
-from galaxy import util
+from galaxy import model, util
from galaxy.util.streamball import StreamBall
from galaxy.util.odict import odict
import logging, tempfile, zipfile, tarfile, os, sys
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/requests_admin.py
--- a/lib/galaxy/web/controllers/requests_admin.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/requests_admin.py Wed Apr 21 11:35:21 2010 -0400
@@ -2,7 +2,7 @@
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy.model.orm import *
from galaxy.datatypes import sniff
-from galaxy import util
+from galaxy import model, util
from galaxy.util.streamball import StreamBall
import logging, tempfile, zipfile, tarfile, os, sys, subprocess
from galaxy.web.form_builder import *
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/root.py
--- a/lib/galaxy/web/controllers/root.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/root.py Wed Apr 21 11:35:21 2010 -0400
@@ -72,7 +72,7 @@
datasets = self.get_history_datasets( trans, history, show_deleted )
return trans.stream_template_mako( "root/history.mako",
history = history,
- annotation = self.get_item_annotation_str( trans.sa_session, trans.get_user(), history ),
+ annotation = self.get_item_annotation_str( trans, trans.user, history ),
datasets = datasets,
hda_id = hda_id,
show_deleted = show_deleted )
@@ -368,7 +368,7 @@
status = 'done'
return trans.fill_template( "/dataset/edit_attributes.mako",
data=data,
- data_annotation=self.get_item_annotation_str( trans.sa_session, trans.get_user(), data ),
+ data_annotation=self.get_item_annotation_str( trans, trans.user, data ),
datatypes=ldatatypes,
current_user_roles=current_user_roles,
all_roles=all_roles,
@@ -392,7 +392,7 @@
if data.parent_id is None and len( data.creating_job_associations ) > 0:
# Mark associated job for deletion
job = data.creating_job_associations[0].job
- if job.state in [ model.Job.states.QUEUED, model.Job.states.RUNNING, model.Job.states.NEW ]:
+ if job.state in [ self.app.model.Job.states.QUEUED, self.app.model.Job.states.RUNNING, self.app.model.Job.states.NEW ]:
# Are *all* of the job's other output datasets deleted?
if job.check_if_output_datasets_deleted():
job.mark_deleted()
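The last change in this hunk swaps a module-level model.Job reference for self.app.model.Job: once controllers are shared between webapps, the model must be resolved through the application object rather than a fixed import. A small runnable sketch of why that matters, with hypothetical stand-in classes:

class GalaxyModel( object ):
    class Job( object ):
        class states:
            NEW, QUEUED, RUNNING = 'new', 'queued', 'running'

class App( object ):
    def __init__( self, model ):
        self.model = model

class Controller( object ):
    def __init__( self, app ):
        self.app = app
    def is_active( self, job_state ):
        # Resolved through self.app, so the same controller code works no
        # matter which webapp's model is in play.
        states = self.app.model.Job.states
        return job_state in ( states.NEW, states.QUEUED, states.RUNNING )

controller = Controller( App( GalaxyModel ) )
print( controller.is_active( 'queued' ) )  # True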
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/tag.py
--- a/lib/galaxy/web/controllers/tag.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/tag.py Wed Apr 21 11:35:21 2010 -0400
@@ -72,7 +72,7 @@
if item_id is not None:
item = self._get_item( trans, item_class, trans.security.decode_id( item_id ) )
user = trans.user
- item_class = self.get_class( item_class )
+ item_class = self.get_class( trans, item_class )
q = q.encode( 'utf-8' )
if q.find( ":" ) == -1:
return self._get_tag_autocomplete_names( trans, q, limit, timestamp, user, item, item_class )
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/tracks.py Wed Apr 21 11:35:21 2010 -0400
@@ -16,6 +16,7 @@
import math, re, logging, glob
log = logging.getLogger(__name__)
+from galaxy import model
from galaxy.util.json import to_json_string, from_json_string
from galaxy.web.base.controller import *
from galaxy.web.framework import simplejson
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/user.py
--- a/lib/galaxy/web/controllers/user.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/user.py Wed Apr 21 11:35:21 2010 -0400
@@ -103,8 +103,9 @@
status='done',
active_view="user" )
@web.expose
- def create( self, trans, webapp='galaxy', redirect_url='', refresh_frames=[], **kwd ):
+ def create( self, trans, redirect_url='', refresh_frames=[], **kwd ):
params = util.Params( kwd )
+ webapp = params.get( 'webapp', 'galaxy' )
use_panels = util.string_as_bool( kwd.get( 'use_panels', True ) )
email = util.restore_text( params.get( 'email', '' ) )
# Do not sanitize passwords, so take from kwd
@@ -165,7 +166,7 @@
action='users',
message='Created new user account (%s)' % user.email,
status='done' ) )
- else:
+ elif not admin_view:
# Must be logging into the community space webapp
trans.handle_user_login( user, webapp )
if not error:
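Above, user.create stops taking webapp as a dedicated function argument and instead reads it from the request parameters, so links that carry webapp=... in their query string (as the grid url_args in this changeset do) select the right behavior. A trivial sketch of the new call shape, with a stand-in function:

def create( trans, redirect_url='', refresh_frames=None, **kwd ):
    # 'webapp' now arrives with the other request parameters.
    webapp = kwd.get( 'webapp', 'galaxy' )
    return webapp

print( create( trans=None ) )                      # galaxy
print( create( trans=None, webapp='community' ) )  # community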
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/visualization.py
--- a/lib/galaxy/web/controllers/visualization.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/visualization.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,3 +1,4 @@
+from galaxy import model
from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import time_ago, grids, iff
from galaxy.util.sanitize_html import sanitize_html
@@ -366,7 +367,7 @@
if visualization.slug is None:
self.create_item_slug( trans.sa_session, visualization )
visualization_slug = visualization.slug
- visualization_annotation = self.get_item_annotation_str( trans.sa_session, trans.get_user(), visualization )
+ visualization_annotation = self.get_item_annotation_str( trans, trans.user, visualization )
if not visualization_annotation:
visualization_annotation = ""
return trans.show_form(
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/controllers/workflow.py
--- a/lib/galaxy/web/controllers/workflow.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/controllers/workflow.py Wed Apr 21 11:35:21 2010 -0400
@@ -14,6 +14,7 @@
from galaxy.util.sanitize_html import sanitize_html
from galaxy.util.topsort import topsort, topsort_levels, CycleError
from galaxy.workflow.modules import *
+from galaxy import model
from galaxy.model.mapping import desc
from galaxy.model.orm import *
@@ -176,9 +177,9 @@
# Get data for workflow's steps.
self.get_stored_workflow_steps( trans, stored_workflow )
# Get annotations.
- stored_workflow.annotation = self.get_item_annotation_str( trans.sa_session, stored_workflow.user, stored_workflow )
+ stored_workflow.annotation = self.get_item_annotation_str( trans, stored_workflow.user, stored_workflow )
for step in stored_workflow.latest_workflow.steps:
- step.annotation = self.get_item_annotation_str( trans.sa_session, stored_workflow.user, step )
+ step.annotation = self.get_item_annotation_str( trans, stored_workflow.user, step )
return trans.fill_template_mako( "workflow/display.mako", item=stored_workflow, item_data=stored_workflow.latest_workflow.steps )
@web.expose
@@ -192,9 +193,9 @@
# Get data for workflow's steps.
self.get_stored_workflow_steps( trans, stored )
# Get annotations.
- stored.annotation = self.get_item_annotation_str( trans.sa_session, stored.user, stored )
+ stored.annotation = self.get_item_annotation_str( trans, stored.user, stored )
for step in stored.latest_workflow.steps:
- step.annotation = self.get_item_annotation_str( trans.sa_session, stored.user, step )
+ step.annotation = self.get_item_annotation_str( trans, stored.user, step )
return trans.stream_template_mako( "/workflow/item_content.mako", item = stored, item_data = stored.latest_workflow.steps )
@web.expose
@@ -330,7 +331,7 @@
return trans.fill_template( 'workflow/edit_attributes.mako',
stored=stored,
- annotation=self.get_item_annotation_str( trans.sa_session, trans.get_user(), stored )
+ annotation=self.get_item_annotation_str( trans, trans.user, stored )
)
@web.expose
@@ -501,7 +502,7 @@
if not id:
error( "Invalid workflow id" )
stored = self.get_stored_workflow( trans, id )
- return trans.fill_template( "workflow/editor.mako", stored=stored, annotation=self.get_item_annotation_str( trans.sa_session, trans.get_user(), stored ) )
+ return trans.fill_template( "workflow/editor.mako", stored=stored, annotation=self.get_item_annotation_str( trans, trans.user, stored ) )
@web.json
def editor_form_post( self, trans, type='tool', tool_id=None, annotation=None, **incoming ):
@@ -580,7 +581,7 @@
# as a dictionary not just the values
data['upgrade_messages'][step.order_index] = upgrade_message.values()
# Get user annotation.
- step_annotation = self.get_item_annotation_obj ( trans.sa_session, trans.get_user(), step )
+ step_annotation = self.get_item_annotation_obj ( trans, trans.user, step )
annotation_str = ""
if step_annotation:
annotation_str = step_annotation.annotation
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/web/framework/helpers/grids.py Wed Apr 21 11:35:21 2010 -0400
@@ -1,7 +1,5 @@
-from galaxy.model import *
from galaxy.model.orm import *
-
-from galaxy.web.base import controller
+from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import iff
from galaxy.web import url_for
from galaxy.util.json import from_json_string, to_json_string
@@ -15,6 +13,7 @@
"""
Specifies the content and format of a grid (data table).
"""
+ webapp = None
title = ""
exposed = True
model_class = None
@@ -43,6 +42,7 @@
self.has_multiple_item_operations = True
break
def __call__( self, trans, **kwargs ):
+ webapp = kwargs.get( 'webapp', 'galaxy' )
status = kwargs.get( 'status', None )
message = kwargs.get( 'message', None )
session = trans.sa_session
@@ -193,7 +193,8 @@
params = cur_filter_dict.copy()
params['sort'] = sort_key
params['async'] = ( 'async' in kwargs )
- trans.log_action( trans.get_user(), unicode( "grid.view"), context, params )
+ params['webapp'] = webapp
+ trans.log_action( trans.get_user(), unicode( "grid.view" ), context, params )
# Render grid.
def url( *args, **kwargs ):
# Only include sort/filter arguments if not linking to another
@@ -214,8 +215,8 @@
else:
new_kwargs[ 'id' ] = trans.security.encode_id( id )
return url_for( **new_kwargs )
- use_panels = ( 'use_panels' in kwargs ) and ( kwargs['use_panels'] == True )
- async_request = ( ( self.use_async ) and ( 'async' in kwargs ) and ( kwargs['async'] in [ 'True', 'true'] ) )
+ use_panels = ( 'use_panels' in kwargs ) and ( kwargs['use_panels'] in [ True, 'True', 'true' ] )
+ async_request = ( ( self.use_async ) and ( 'async' in kwargs ) and ( kwargs['async'] in [ True, 'True', 'true'] ) )
return trans.fill_template( iff( async_request, self.async_template, self.template),
grid=self,
query=query,
@@ -232,6 +233,7 @@
message_type = status,
message = message,
use_panels=use_panels,
+ webapp=self.webapp,
# Pass back kwargs so that grid template can set and use args without
# grid explicitly having to pass them.
kwargs=kwargs )
@@ -333,7 +335,7 @@
model_class_key_field = getattr( self.model_class, self.key )
return func.lower( model_class_key_field ).like( "%" + a_filter.lower() + "%" )
-class OwnerAnnotationColumn( TextColumn, controller.UsesAnnotations ):
+class OwnerAnnotationColumn( TextColumn, UsesAnnotations ):
""" Column that displays and filters item owner's annotations. """
def __init__( self, col_name, key, model_class, model_annotation_association_class, filterable ):
GridColumn.__init__( self, col_name, key=key, model_class=model_class, filterable=filterable )
@@ -341,7 +343,7 @@
self.model_annotation_association_class = model_annotation_association_class
def get_value( self, trans, grid, item ):
""" Returns item annotation. """
- annotation = self.get_item_annotation_str( trans.sa_session, item.user, item )
+ annotation = self.get_item_annotation_str( trans, item.user, item )
return iff( annotation, annotation, "" )
def get_single_filter( self, user, a_filter ):
""" Filter by annotation and annotation owner. """
@@ -515,7 +517,8 @@
return accepted_filters
class GridOperation( object ):
- def __init__( self, label, key=None, condition=None, allow_multiple=True, allow_popup=True, target=None, url_args=None, async_compatible=False, confirm=None ):
+ def __init__( self, label, key=None, condition=None, allow_multiple=True, allow_popup=True,
+ target=None, url_args=None, async_compatible=False, confirm=None ):
self.label = label
self.key = key
self.allow_multiple = allow_multiple
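The grids.py hunk above gives every Grid a webapp attribute and threads it into logging and the template; together with the url_args added to the GridOperations at the top of this changeset, an operation link can name its exact target action instead of relying on the generic operation dispatch. A hedged sketch of how url_args merge into a link (the URL building here is simplified; Galaxy uses url_for):

def operation_url( label, url_args, item_id, webapp='galaxy' ):
    # With url_args, the link names its target action directly; without,
    # fall back to the generic operation=<label> dispatch.
    kwargs = dict( url_args ) if url_args else dict( operation=label )
    kwargs.setdefault( 'webapp', webapp )
    kwargs[ 'id' ] = item_id
    return '/admin/?' + '&'.join( '%s=%s' % kv for kv in sorted( kwargs.items() ) )

print( operation_url( 'Rename', dict( webapp='galaxy', action='rename_role' ), 42 ) )
# /admin/?action=rename_role&id=42&webapp=galaxy
print( operation_url( 'Rename', None, 42 ) )
# /admin/?id=42&operation=Rename&webapp=galaxy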
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/webapps/community/base/controller.py
--- a/lib/galaxy/webapps/community/base/controller.py Wed Apr 21 10:41:30 2010 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-"""Contains functionality needed in every webapp interface"""
-import os, time, logging
-# Pieces of Galaxy to make global in every controller
-from galaxy import config, tools, web, util
-from galaxy.web import error, form, url_for
-from galaxy.webapps.community import model
-from galaxy.model.orm import *
-
-from Cheetah.Template import Template
-
-log = logging.getLogger( __name__ )
-
-class BaseController( object ):
- """Base class for Galaxy webapp application controllers."""
- def __init__( self, app ):
- """Initialize an interface for application 'app'"""
- self.app = app
- def get_class( self, class_name ):
- """ Returns the class object that a string denotes. Without this method, we'd have to do eval(<class_name>). """
- if class_name == 'Tool':
- item_class = model.Tool
- else:
- item_class = None
- return item_class
\ No newline at end of file
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/webapps/community/buildapp.py
--- a/lib/galaxy/webapps/community/buildapp.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/webapps/community/buildapp.py Wed Apr 21 11:35:21 2010 -0400
@@ -25,7 +25,8 @@
Search for controllers in the 'galaxy.webapps.controllers' module and add
them to the webapp.
"""
- from galaxy.webapps.community.base.controller import BaseController
+ from galaxy.web.base.controller import BaseController
+ from galaxy.web.base.controller import ControllerUnavailable
import galaxy.webapps.community.controllers
controller_dir = galaxy.webapps.community.controllers.__path__[0]
for fname in os.listdir( controller_dir ):
@@ -40,12 +41,11 @@
T = getattr( module, key )
if isclass( T ) and T is not BaseController and issubclass( T, BaseController ):
webapp.add_controller( name, T( app ) )
- from galaxy.web.base.controller import BaseController
import galaxy.web.controllers
controller_dir = galaxy.web.controllers.__path__[0]
for fname in os.listdir( controller_dir ):
# TODO: fix this if we decide to use, we don't need to inspect all controllers...
- if fname.startswith( 'user' ) and fname.endswith( ".py" ):
+ if fname.startswith( 'user' ) and fname.endswith( ".py" ):
name = fname[:-3]
module_name = "galaxy.web.controllers." + name
module = __import__( module_name )
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/webapps/community/config.py
--- a/lib/galaxy/webapps/community/config.py Wed Apr 21 10:41:30 2010 -0400
+++ b/lib/galaxy/webapps/community/config.py Wed Apr 21 11:35:21 2010 -0400
@@ -79,15 +79,12 @@
for path in self.root, self.file_path, self.template_path:
if not os.path.isdir( path ):
raise ConfigurationError("Directory does not exist: %s" % path )
- def is_admin_user( self,user ):
+ def is_admin_user( self, user ):
"""
Determine if the provided user is listed in `admin_users`.
-
- NOTE: This is temporary, admin users will likely be specified in the
- database in the future.
"""
admin_users = self.get( "admin_users", "" ).split( "," )
- return ( user is not None and user.email in admin_users )
+ return user is not None and user.email in admin_users
def get_database_engine_options( kwargs ):
"""
diff -r 076f572d7c9d -r d6fddb034db7 lib/galaxy/webapps/community/controllers/admin.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/webapps/community/controllers/admin.py Wed Apr 21 11:35:21 2010 -0400
@@ -0,0 +1,285 @@
+from galaxy.web.base.controller import *
+#from galaxy.web.controllers.admin import get_user, get_group, get_role
+from galaxy.webapps.community import model
+from galaxy.model.orm import *
+from galaxy.web.framework.helpers import time_ago, iff, grids
+import logging
+log = logging.getLogger( __name__ )
+
+class UserListGrid( grids.Grid ):
+ class EmailColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, user ):
+ return user.email
+ class UserNameColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.username:
+ return user.username
+ return 'not set'
+ class StatusColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.purged:
+ return "purged"
+ elif user.deleted:
+ return "deleted"
+ return ""
+ class GroupsColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.groups:
+ return len( user.groups )
+ return 0
+ class RolesColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.roles:
+ return len( user.roles )
+ return 0
+ class ExternalColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.external:
+ return 'yes'
+ return 'no'
+ class LastLoginColumn( grids.GridColumn ):
+ def get_value( self, trans, grid, user ):
+ if user.galaxy_sessions:
+ return self.format( user.galaxy_sessions[ 0 ].update_time )
+ return 'never'
+
+ log.debug("####In UserListGrid, in community" )
+ # Grid definition
+ webapp = "community"
+ title = "Users"
+ model_class = model.User
+ template='/admin/user/grid.mako'
+ default_sort_key = "email"
+ columns = [
+ EmailColumn( "Email",
+ key="email",
+ model_class=model.User,
+ link=( lambda item: dict( operation="information", id=item.id, webapp="community" ) ),
+ attach_popup=True,
+ filterable="advanced" ),
+ UserNameColumn( "User Name",
+ key="username",
+ model_class=model.User,
+ attach_popup=False,
+ filterable="advanced" ),
+ GroupsColumn( "Groups", attach_popup=False ),
+ RolesColumn( "Roles", attach_popup=False ),
+ ExternalColumn( "External", attach_popup=False ),
+ LastLoginColumn( "Last Login", format=time_ago ),
+ StatusColumn( "Status", attach_popup=False ),
+ # Columns that are valid for filtering but are not visible.
+ grids.DeletedColumn( "Deleted", key="deleted", visible=False, filterable="advanced" )
+ ]
+ columns.append( grids.MulticolFilterColumn( "Search",
+ cols_to_filter=[ columns[0], columns[1] ],
+ key="free-text-search",
+ visible=False,
+ filterable="standard" ) )
+ global_actions = [
+ grids.GridAction( "Create new user",
+ dict( controller='admin', action='users', operation='create', webapp="community" ) )
+ ]
+ operations = [
+ grids.GridOperation( "Manage Roles and Groups",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=False,
+ url_args=dict( webapp="community", action="manage_roles_and_groups_for_user" ) ),
+ grids.GridOperation( "Reset Password",
+ condition=( lambda item: not item.deleted ),
+ allow_multiple=True,
+ allow_popup=False,
+ url_args=dict( webapp="community", action="reset_user_password" ) )
+ ]
details: http://www.bx.psu.edu/hg/galaxy/rev/076f572d7c9d
changeset: 3674:076f572d7c9d
user: rc
date: Wed Apr 21 10:41:30 2010 -0400
description:
lims:
- data transfer now uses RabbitMQ
- datasets can now be renamed before transferring from the sequencer
- data transfer code refactored
diffstat:
lib/galaxy/config.py | 4 +
lib/galaxy/model/__init__.py | 28 +-
lib/galaxy/web/controllers/requests_admin.py | 121 +++++++-
lib/galaxy/web/framework/__init__.py | 1 +
run_galaxy_listener.sh | 2 +-
scripts/galaxy_messaging/client/amqp_publisher.py | 4 +-
scripts/galaxy_messaging/server/amqp_consumer.py | 66 +++-
scripts/galaxy_messaging/server/data_transfer.py | 241 +++++++++-------
scripts/galaxy_messaging/server/galaxydb_interface.py | 17 +-
scripts/galaxy_messaging/server/galaxyweb_interface.py | 132 +++++++++
templates/admin/requests/dataset.mako | 71 +++++
templates/admin/requests/get_data.mako | 67 ++-
universe_wsgi.ini.sample | 2 +-
13 files changed, 577 insertions(+), 179 deletions(-)
diffs (1118 lines):
diff -r 207d0d70483b -r 076f572d7c9d lib/galaxy/config.py
--- a/lib/galaxy/config.py Tue Apr 20 15:36:03 2010 -0400
+++ b/lib/galaxy/config.py Wed Apr 21 10:41:30 2010 -0400
@@ -123,6 +123,10 @@
self.enable_cloud_execution = string_as_bool( kwargs.get( 'enable_cloud_execution', 'True' ) )
else:
self.enable_cloud_execution = string_as_bool( kwargs.get( 'enable_cloud_execution', 'False' ) )
+ # Galaxy messaging (AMQP) configuration options
+ self.amqp = {}
+ for k, v in global_conf_parser.items("galaxy_amqp"):
+ self.amqp[k] = v
def get( self, key, default ):
return self.config_dict.get( key, default )
def get_bool( self, key, default ):
diff -r 207d0d70483b -r 076f572d7c9d lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Tue Apr 20 15:36:03 2010 -0400
+++ b/lib/galaxy/model/__init__.py Wed Apr 21 10:41:30 2010 -0400
@@ -18,6 +18,7 @@
import logging
log = logging.getLogger( __name__ )
from sqlalchemy.orm import object_session
+import pexpect
datatypes_registry = galaxy.datatypes.registry.Registry() #Default Value Required for unit tests
@@ -1455,7 +1456,9 @@
class Sample( object ):
transfer_status = Bunch( NOT_STARTED = 'Not started',
- IN_PROGRESS = 'In progress',
+ IN_QUEUE = 'In queue',
+ TRANSFERRING = 'Transferring dataset',
+ ADD_TO_LIBRARY = 'Adding to data library',
COMPLETE = 'Complete',
ERROR = 'Error')
def __init__(self, name=None, desc=None, request=None, form_values=None,
@@ -1474,22 +1477,33 @@
return None
def untransferred_dataset_files(self):
count = 0
- for df, status in self.dataset_files:
- if status == self.transfer_status.NOT_STARTED:
+ for df in self.dataset_files:
+ if df['status'] == self.transfer_status.NOT_STARTED:
count = count + 1
return count
def inprogress_dataset_files(self):
count = 0
- for df, status in self.dataset_files:
- if status == self.transfer_status.IN_PROGRESS:
+ for df in self.dataset_files:
+ if df['status'] not in [self.transfer_status.NOT_STARTED, self.transfer_status.COMPLETE]:
count = count + 1
return count
def transferred_dataset_files(self):
count = 0
- for df, status in self.dataset_files:
- if status == self.transfer_status.COMPLETE:
+ for df in self.dataset_files:
+ if df['status'] == self.transfer_status.COMPLETE:
count = count + 1
return count
+ def dataset_size(self, filepath):
+ def print_ticks(d):
+ pass
+ datatx_info = self.request.type.datatx_info
+ cmd = 'ssh %s@%s "du -sh %s"' % ( datatx_info['username'],
+ datatx_info['host'],
+ filepath)
+ output = pexpect.run(cmd, events={'.ssword:*': datatx_info['password']+'\r\n',
+ pexpect.TIMEOUT:print_ticks},
+ timeout=10)
+ return output.split('\t')[0]
class SampleState( object ):
def __init__(self, name=None, desc=None, request_type=None):
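The dataset_size method above drives ssh through pexpect rather than a key-based channel; a standalone sketch of the same pattern, with host, user, password, and path as placeholders:

    import pexpect

    def remote_du( username, host, password, filepath ):
        # Run "du -sh" on the remote host, answering the password prompt
        # the same way the model code does.
        cmd = 'ssh %s@%s "du -sh %s"' % ( username, host, filepath )
        output = pexpect.run( cmd,
                              events={ '.ssword:*': password + '\r\n' },
                              timeout=10 )
        # du prints "<size>\t<path>"; keep just the size.
        return output.split( '\t' )[0]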
diff -r 207d0d70483b -r 076f572d7c9d lib/galaxy/web/controllers/requests_admin.py
--- a/lib/galaxy/web/controllers/requests_admin.py Tue Apr 20 15:36:03 2010 -0400
+++ b/lib/galaxy/web/controllers/requests_admin.py Wed Apr 21 10:41:30 2010 -0400
@@ -12,6 +12,7 @@
from sqlalchemy.sql import select
import pexpect
import ConfigParser, threading, time
+from amqplib import client_0_8 as amqp
log = logging.getLogger( __name__ )
@@ -64,7 +65,6 @@
.filter( self.event_class.table.c.id.in_(select(columns=[func.max(self.event_class.table.c.id)],
from_obj=self.event_class.table,
group_by=self.event_class.table.c.request_id)))
- #print column_filter, q
return q
def get_accepted_filters( self ):
""" Returns a list of accepted filters for this column. """
@@ -1509,8 +1509,11 @@
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
- folder_path = util.restore_text( params.get( 'folder_path', '' ) )
+ folder_path = util.restore_text( params.get( 'folder_path',
+ sample.request.type.datatx_info['data_dir'] ) )
files_list = util.listify( params.get( 'files_list', '' ) )
+ if params.get( 'start_transfer_button', False ) == 'True':
+ return self.__start_datatx(trans, sample)
if not folder_path:
return trans.fill_template( '/admin/requests/get_data.mako',
sample=sample, files=[],
@@ -1544,32 +1547,43 @@
dataset_files=sample.dataset_files,
folder_path=folder_path )
elif params.get( 'remove_dataset_button', False ):
+ # get the filenames from the remote host
+ files = self.__get_files(trans, sample, folder_path)
dataset_index = int(params.get( 'dataset_index', 0 ))
del sample.dataset_files[dataset_index]
trans.sa_session.add( sample )
trans.sa_session.flush()
return trans.fill_template( '/admin/requests/get_data.mako',
- sample=sample,
- dataset_files=sample.dataset_files)
- elif params.get( 'start_transfer_button', False ):
+ sample=sample, files=files,
+ dataset_files=sample.dataset_files,
+ folder_path=folder_path)
+ elif params.get( 'select_files_button', False ):
folder_files = []
if len(files_list):
for f in files_list:
+ filepath = os.path.join(folder_path, f)
if f[-1] == os.sep:
# the selected item is a folder so transfer all the
# folder contents
- self.__get_files_in_dir(trans, sample, os.path.join(folder_path, f))
+ self.__get_files_in_dir(trans, sample, filepath)
else:
- sample.dataset_files.append([os.path.join(folder_path, f),
- sample.transfer_status.NOT_STARTED])
+ sample.dataset_files.append(dict(filepath=filepath,
+ status=sample.transfer_status.NOT_STARTED,
+ name=filepath.split('/')[-1],
+ error_msg='',
+ size=sample.dataset_size(filepath)))
trans.sa_session.add( sample )
trans.sa_session.flush()
- return self.__start_datatx(trans, sample)
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='show_datatx_page',
sample_id=trans.security.encode_id(sample.id),
folder_path=folder_path))
+ return trans.response.send_redirect( web.url_for( controller='requests_admin',
+ action='show_datatx_page',
+ sample_id=trans.security.encode_id(sample.id),
+ folder_path=folder_path))
+
def __setup_datatx_user(self, trans, library, folder):
'''
This method sets up the datatx user:
@@ -1620,7 +1634,62 @@
trans.sa_session.add( dp )
trans.sa_session.flush()
return datatx_user
-
+
+ def __send_message(self, trans, datatx_info, sample):
+ '''
+ This method creates the xml message and sends it to the rabbitmq server
+ '''
+ # first create the xml message based on the following template
+ xml = \
+ ''' <data_transfer>
+ <data_host>%(DATA_HOST)s</data_host>
+ <data_user>%(DATA_USER)s</data_user>
+ <data_password>%(DATA_PASSWORD)s</data_password>
+ <sample_id>%(SAMPLE_ID)s</sample_id>
+ <library_id>%(LIBRARY_ID)s</library_id>
+ <folder_id>%(FOLDER_ID)s</folder_id>
+ %(DATASETS)s
+ </data_transfer>'''
+ dataset_xml = \
+ '''<dataset>
+ <index>%(INDEX)s</index>
+ <name>%(NAME)s</name>
+ <file>%(FILE)s</file>
+ </dataset>'''
+ datasets = ''
+ for index, dataset in enumerate(sample.dataset_files):
+ if dataset['status'] == sample.transfer_status.NOT_STARTED:
+ datasets = datasets + dataset_xml % dict(INDEX=str(index),
+ NAME=dataset['name'],
+ FILE=dataset['filepath'])
+ sample.dataset_files[index]['status'] = sample.transfer_status.IN_QUEUE
+
+ trans.sa_session.add( sample )
+ trans.sa_session.flush()
+ data = xml % dict(DATA_HOST=datatx_info['host'],
+ DATA_USER=datatx_info['username'],
+ DATA_PASSWORD=datatx_info['password'],
+ SAMPLE_ID=str(sample.id),
+ LIBRARY_ID=str(sample.library.id),
+ FOLDER_ID=str(sample.folder.id),
+ DATASETS=datasets)
+ # now send this message
+ conn = amqp.Connection(host=trans.app.config.amqp['host']+":"+trans.app.config.amqp['port'],
+ userid=trans.app.config.amqp['userid'],
+ password=trans.app.config.amqp['password'],
+ virtual_host=trans.app.config.amqp['virtual_host'],
+ insist=False)
+ chan = conn.channel()
+ msg = amqp.Message(data,
+ content_type='text/plain',
+ application_headers={'msg_type': 'data_transfer'})
+ msg.properties["delivery_mode"] = 2
+ chan.basic_publish(msg,
+ exchange=trans.app.config.amqp['exchange'],
+ routing_key=trans.app.config.amqp['routing_key'])
+ chan.close()
+ conn.close()
+
def __start_datatx(self, trans, sample):
# data transfer user
datatx_user = self.__setup_datatx_user(trans, sample.library, sample.folder)
@@ -1635,6 +1704,11 @@
sample_id=trans.security.encode_id(sample.id),
status='error',
message=message))
+ self.__send_message(trans, datatx_info, sample)
+ return trans.response.send_redirect( web.url_for( controller='requests_admin',
+ action='show_datatx_page',
+ sample_id=trans.security.encode_id(sample.id),
+ folder_path=datatx_info['data_dir']))
error_message = ''
transfer_script = "scripts/galaxy_messaging/server/data_transfer.py"
for index, dataset in enumerate(sample.dataset_files):
@@ -1670,6 +1744,33 @@
action='show_datatx_page',
sample_id=trans.security.encode_id(sample.id),
folder_path=os.path.dirname(dfile)))
+
+ @web.expose
+ @web.require_admin
+ def dataset_details( self, trans, **kwd ):
+ try:
+ sample = trans.sa_session.query( trans.app.model.Sample ).get( trans.security.decode_id(kwd['sample_id']) )
+ except:
+ return trans.response.send_redirect( web.url_for( controller='requests_admin',
+ action='list',
+ status='error',
+ message="Invalid sample ID" ) )
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ dataset_index = int( params.get( 'dataset_index', '' ) )
+ if params.get('save', '') == 'Save':
+ sample.dataset_files[dataset_index]['name'] = util.restore_text( params.get( 'name',
+ sample.dataset_files[dataset_index]['name'] ) )
+ trans.sa_session.add( sample )
+ trans.sa_session.flush()
+ status = 'done'
+ message = 'Saved the changes made to the dataset.'
+ return trans.fill_template( '/admin/requests/dataset.mako',
+ sample=sample,
+ dataset_index=dataset_index,
+ message=message,
+ status=status)
##
#### Request Type Stuff ###################################################
##
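For reference, a message assembled from the templates in __send_message might look like the placeholder payload below; the consumer side parses it with xml.dom.minidom just as data_transfer.py does:

    import xml.dom.minidom

    # Placeholder payload; host, credentials, and ids are invented.
    sample_msg = '''<data_transfer>
        <data_host>sequencer.example.org</data_host>
        <data_user>datatx</data_user>
        <data_password>secret</data_password>
        <sample_id>42</sample_id>
        <library_id>7</library_id>
        <folder_id>3</folder_id>
        <dataset>
            <index>0</index>
            <name>run1.fastq</name>
            <file>/mnt/run1.fastq</file>
        </dataset>
    </data_transfer>'''

    dom = xml.dom.minidom.parseString( sample_msg )
    print dom.getElementsByTagName( 'sample_id' )[0].firstChild.data   # 42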
diff -r 207d0d70483b -r 076f572d7c9d lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py Tue Apr 20 15:36:03 2010 -0400
+++ b/lib/galaxy/web/framework/__init__.py Wed Apr 21 10:41:30 2010 -0400
@@ -32,6 +32,7 @@
from sqlalchemy import and_
pkg_resources.require( "pexpect" )
+pkg_resources.require( "amqplib" )
import logging
log = logging.getLogger( __name__ )
diff -r 207d0d70483b -r 076f572d7c9d run_galaxy_listener.sh
--- a/run_galaxy_listener.sh Tue Apr 20 15:36:03 2010 -0400
+++ b/run_galaxy_listener.sh Wed Apr 21 10:41:30 2010 -0400
@@ -1,4 +1,4 @@
#!/bin/sh
cd `dirname $0`
-python scripts/galaxy_messaging/server/amqp_consumer.py universe_wsgi.ini >> galaxy_listener.log 2>&1
\ No newline at end of file
+python scripts/galaxy_messaging/server/amqp_consumer.py universe_wsgi.ini 2>&1
\ No newline at end of file
diff -r 207d0d70483b -r 076f572d7c9d scripts/galaxy_messaging/client/amqp_publisher.py
--- a/scripts/galaxy_messaging/client/amqp_publisher.py Tue Apr 20 15:36:03 2010 -0400
+++ b/scripts/galaxy_messaging/client/amqp_publisher.py Wed Apr 21 10:41:30 2010 -0400
@@ -35,7 +35,9 @@
virtual_host=amqp_config['virtual_host'],
insist=False)
chan = conn.channel()
- msg = amqp.Message(data)
+ msg = amqp.Message(data,
+ content_type='text/plain',
+ application_headers={'msg_type': 'sample_state_update'})
msg.properties["delivery_mode"] = 2
chan.basic_publish(msg,
exchange=amqp_config['exchange'],
diff -r 207d0d70483b -r 076f572d7c9d scripts/galaxy_messaging/server/amqp_consumer.py
--- a/scripts/galaxy_messaging/server/amqp_consumer.py Tue Apr 20 15:36:03 2010 -0400
+++ b/scripts/galaxy_messaging/server/amqp_consumer.py Wed Apr 21 10:41:30 2010 -0400
@@ -13,6 +13,7 @@
import sys, os
import optparse
import xml.dom.minidom
+import subprocess
from galaxydb_interface import GalaxyDbInterface
assert sys.version_info[:2] >= ( 2, 4 )
@@ -27,8 +28,13 @@
from amqplib import client_0_8 as amqp
import logging
-logging.basicConfig(level=logging.DEBUG)
-log = logging.getLogger( 'GalaxyAMQP' )
+log = logging.getLogger("GalaxyAMQP")
+log.setLevel(logging.DEBUG)
+fh = logging.FileHandler("galaxy_listener.log")
+fh.setLevel(logging.DEBUG)
+formatter = logging.Formatter("%(asctime)s - %(name)s - %(message)s")
+fh.setFormatter(formatter)
+log.addHandler(fh)
global dbconnstr
@@ -43,19 +49,47 @@
rc = rc + node.data
return rc
+def get_value_index(dom, tag_name, index):
+ '''
+ This method extracts the tag value from the xml message
+ '''
+ try:
+ nodelist = dom.getElementsByTagName(tag_name)[index].childNodes
+ except:
+ return None
+ rc = ""
+ for node in nodelist:
+ if node.nodeType == node.TEXT_NODE:
+ rc = rc + node.data
+ return rc
+
def recv_callback(msg):
- dom = xml.dom.minidom.parseString(msg.body)
- barcode = get_value(dom, 'barcode')
- state = get_value(dom, 'state')
- log.debug('Barcode: '+barcode)
- log.debug('State: '+state)
- # update the galaxy db
- galaxy = GalaxyDbInterface(dbconnstr)
- sample_id = galaxy.get_sample_id(field_name='bar_code', value=barcode)
- if sample_id == -1:
- log.debug('Invalid barcode.')
- return
- galaxy.change_state(sample_id, state)
+ # check the message type.
+ msg_type = msg.properties['application_headers'].get('msg_type')
+ log.debug('\nMESSAGE RECVD: '+str(msg_type))
+ if msg_type == 'data_transfer':
+ log.debug('DATA TRANSFER')
+ # fork a new process to transfer datasets
+ transfer_script = "scripts/galaxy_messaging/server/data_transfer.py"
+ cmd = ( "python",
+ transfer_script,
+ msg.body )
+ pid = subprocess.Popen(cmd).pid
+ log.debug('Started process (%i): %s' % (pid, str(cmd)))
+ elif msg_type == 'sample_state_update':
+ log.debug('SAMPLE STATE UPDATE')
+ dom = xml.dom.minidom.parseString(msg.body)
+ barcode = get_value(dom, 'barcode')
+ state = get_value(dom, 'state')
+ log.debug('Barcode: '+barcode)
+ log.debug('State: '+state)
+ # update the galaxy db
+ galaxy = GalaxyDbInterface(dbconnstr)
+ sample_id = galaxy.get_sample_id(field_name='bar_code', value=barcode)
+ if sample_id == -1:
+ log.debug('Invalid barcode.')
+ return
+ galaxy.change_state(sample_id, state)
def main():
if len(sys.argv) < 2:
@@ -66,8 +100,8 @@
global dbconnstr
dbconnstr = config.get("app:main", "database_connection")
amqp_config = {}
- for option in config.options("galaxy:amqp"):
- amqp_config[option] = config.get("galaxy:amqp", option)
+ for option in config.options("galaxy_amqp"):
+ amqp_config[option] = config.get("galaxy_amqp", option)
log.debug(str(amqp_config))
conn = amqp.Connection(host=amqp_config['host']+":"+amqp_config['port'],
userid=amqp_config['userid'],
diff -r 207d0d70483b -r 076f572d7c9d scripts/galaxy_messaging/server/data_transfer.py
--- a/scripts/galaxy_messaging/server/data_transfer.py Tue Apr 20 15:36:03 2010 -0400
+++ b/scripts/galaxy_messaging/server/data_transfer.py Wed Apr 21 10:41:30 2010 -0400
@@ -8,28 +8,36 @@
Usage:
-python data_transfer.py <sequencer_host>
- <username>
- <password>
- <source_file>
- <sample_id>
- <dataset_index>
- <library_id>
- <folder_id>
+python data_transfer.py <data_transfer_xml>
+
+
"""
import ConfigParser
import sys, os, time, traceback
import optparse
import urllib,urllib2, cookielib, shutil
import logging, time
+import xml.dom.minidom
+
+sp = sys.path[0]
+
from galaxydb_interface import GalaxyDbInterface
assert sys.version_info[:2] >= ( 2, 4 )
+new_path = [ sp ]
+new_path.extend( sys.path )
+sys.path = new_path
+
+from galaxyweb_interface import GalaxyWebInterface
+
+assert sys.version_info[:2] >= ( 2, 4 )
new_path = [ os.path.join( os.getcwd(), "lib" ) ]
new_path.extend( sys.path[1:] ) # remove scripts/ from the path
sys.path = new_path
+
from galaxy.util.json import from_json_string, to_json_string
+from galaxy.model import Sample
from galaxy import eggs
import pkg_resources
pkg_resources.require( "pexpect" )
@@ -38,28 +46,39 @@
pkg_resources.require( "simplejson" )
import simplejson
-logging.basicConfig(filename=sys.stderr, level=logging.DEBUG,
- format="%(asctime)s [%(levelname)s] %(message)s")
-
-class DataTransferException(Exception):
- def __init__(self, value):
- self.msg = value
- def __str__(self):
- return repr(self.msg)
+log = logging.getLogger("datatx_"+str(os.getpid()))
+log.setLevel(logging.DEBUG)
+fh = logging.FileHandler("data_transfer.log")
+fh.setLevel(logging.DEBUG)
+formatter = logging.Formatter("%(asctime)s - %(name)s - %(message)s")
+fh.setFormatter(formatter)
+log.addHandler(fh)
class DataTransfer(object):
- def __init__(self, host, username, password, remote_file, sample_id,
- dataset_index, library_id, folder_id):
- self.host = host
- self.username = username
- self.password = password
- self.remote_file = remote_file
- self.sample_id = sample_id
- self.dataset_index = dataset_index
- self.library_id = library_id
- self.folder_id = folder_id
+ def __init__(self, msg):
+ log.info(msg)
+ self.dom = xml.dom.minidom.parseString(msg)
+ self.host = self.get_value(self.dom, 'data_host')
+ self.username = self.get_value(self.dom, 'data_user')
+ self.password = self.get_value(self.dom, 'data_password')
+ self.sample_id = self.get_value(self.dom, 'sample_id')
+ self.library_id = self.get_value(self.dom, 'library_id')
+ self.folder_id = self.get_value(self.dom, 'folder_id')
+ self.dataset_files = []
+ count=0
+ while True:
+ index = self.get_value_index(self.dom, 'index', count)
+ file = self.get_value_index(self.dom, 'file', count)
+ name = self.get_value_index(self.dom, 'name', count)
+ if file:
+ self.dataset_files.append(dict(name=name,
+ index=int(index),
+ file=file))
+ else:
+ break
+ count=count+1
try:
# Retrieve the upload user login information from the config file
config = ConfigParser.ConfigParser()
@@ -75,11 +94,13 @@
os.mkdir(self.server_dir)
if not os.path.exists(self.server_dir):
raise Exception
+ # connect to db
+ self.galaxydb = GalaxyDbInterface(self.database_connection)
except:
- logging.error(traceback.format_exc())
- logging.error('FATAL ERROR')
+ log.error(traceback.format_exc())
+ log.error('FATAL ERROR')
if self.database_connection:
- self.update_status('Error')
+ self.error_and_exit('Error')
sys.exit(1)
def start(self):
@@ -88,13 +109,13 @@
to the data library & finally updates the data transfer status in the db
'''
# datatx
- self.transfer_file()
+ self.transfer_files()
# add the dataset to the given library
self.add_to_library()
# update the data transfer status in the db
- self.update_status('Complete')
+ self.update_status(Sample.transfer_status.COMPLETE)
# cleanup
- self.cleanup()
+ #self.cleanup()
sys.exit(0)
def cleanup(self):
@@ -114,34 +135,39 @@
This method is called when any exception is raised. It prints the traceback
and terminates this script.
'''
- logging.error(traceback.format_exc())
- logging.error('FATAL ERROR.'+msg)
- self.update_status('Error.'+msg)
+ log.error(traceback.format_exc())
+ log.error('FATAL ERROR.'+msg)
+ self.update_status('Error.', 'All', msg)
sys.exit(1)
- def transfer_file(self):
+ def transfer_files(self):
'''
This method executes a scp process using pexpect library to transfer
the dataset file from the remote sequencer to the Galaxy server
'''
def print_ticks(d):
pass
- try:
- cmd = "scp %s@%s:%s %s" % ( self.username,
- self.host,
- self.remote_file,
- self.server_dir)
- logging.debug(cmd)
- output = pexpect.run(cmd, events={'.ssword:*': self.password+'\r\n',
- pexpect.TIMEOUT:print_ticks},
- timeout=10)
- logging.debug(output)
- if not os.path.exists(os.path.join(self.server_dir, os.path.basename(self.remote_file))):
- raise DataTransferException('Could not find the local file after transfer (%s)' % os.path.join(self.server_dir, os.path.basename(self.remote_file)))
- except DataTransferException, (e):
- self.error_and_exit(e.msg)
- except:
- self.error_and_exit()
+ for i, df in enumerate(self.dataset_files):
+ self.update_status(Sample.transfer_status.TRANSFERRING, df['index'])
+ try:
+ cmd = "scp %s@%s:%s %s/%s" % ( self.username,
+ self.host,
+ df['file'],
+ self.server_dir,
+ df['name'])
+ log.debug(cmd)
+ output = pexpect.run(cmd, events={'.ssword:*': self.password+'\r\n',
+ pexpect.TIMEOUT:print_ticks},
+ timeout=10)
+ log.debug(output)
+ path = os.path.join(self.server_dir, os.path.basename(df['file']))
+ if not os.path.exists(path):
+ msg = 'Could not find the local file after transfer (%s)' % path
+ log.error(msg)
+ raise Exception(msg)
+ except Exception, e:
+ msg = traceback.format_exc()
+ self.update_status('Error', df['index'], msg)
def add_to_library(self):
@@ -149,73 +175,72 @@
This method adds the dataset file to the target data library & folder
by opening the corresponding url in Galaxy server running.
'''
- try:
- logging.debug('Adding %s to library...' % os.path.basename(self.remote_file))
- # create url
- base_url = "http://%s:%s" % (self.server_host, self.server_port)
- # login
- url = "%s/user/login?email=%s&password=%s" % (base_url, self.datatx_email, self.datatx_password)
- cj = cookielib.CookieJar()
- opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
- f = opener.open(url)
- if f.read().find("ogged in as "+self.datatx_email) == -1:
- # if the user doesnt exist, create the user
- url = "%s/user/create?email=%s&username=%s&password=%s&confirm=%s&create_user_button=Submit" % ( base_url, self.datatx_email, self.datatx_email, self.datatx_password, self.datatx_password )
- f = opener.open(url)
- if f.read().find("ogged in as "+self.datatx_email) == -1:
- raise DataTransferException("The "+self.datatx_email+" user could not login to Galaxy")
- # after login, add dataset to the library
- params = urllib.urlencode(dict( cntrller='library_admin',
- tool_id='upload1',
- tool_state='None',
- library_id=self.library_id,
- folder_id=self.folder_id,
- upload_option='upload_directory',
- file_type='auto',
- server_dir=os.path.basename(self.server_dir),
- dbkey='',
- runtool_btn='Upload to library'))
- #url = "http://localhost:8080/library_common/upload_library_dataset?cntrller=librar…"
- #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
- url = base_url+"/library_common/upload_library_dataset"
- logging.debug(url)
- logging.debug(params)
- f = opener.open(url, params)
- if f.read().find("Data Library") == -1:
- raise DataTransferException("Dataset could not be uploaded to the data library")
- # finally logout
- f = opener.open(base_url+'/user/logout')
- if f.read().find("You have been logged out.") == -1:
- raise DataTransferException("The "+self.datatx_email+" user could not logout of Galaxy")
- except DataTransferException, (e):
- self.error_and_exit(e.msg)
- except:
- self.error_and_exit()
+ self.update_status(Sample.transfer_status.ADD_TO_LIBRARY)
+ galaxyweb = GalaxyWebInterface(self.server_host, self.server_port,
+ self.datatx_email, self.datatx_password)
+ galaxyweb.add_to_library(self.server_dir, self.library_id, self.folder_id)
+ galaxyweb.logout()
- def update_status(self, status):
+ def update_status(self, status, dataset_index='All', msg=''):
'''
Update the data transfer status for this dataset in the database
'''
try:
- galaxy = GalaxyDbInterface(self.database_connection)
- df = from_json_string(galaxy.get_sample_dataset_files(self.sample_id))
- logging.debug(df)
- df[self.dataset_index][1] = status
- galaxy.set_sample_dataset_files(self.sample_id, to_json_string(df))
- logging.debug("######################\n"+str(from_json_string(galaxy.get_sample_dataset_files(self.sample_id))[self.dataset_index]))
+ log.debug('Setting status "%s" for sample "%s"' % ( status, str(dataset_index) ) )
+ df = from_json_string(self.galaxydb.get_sample_dataset_files(self.sample_id))
+ if dataset_index == 'All':
+ for dataset in self.dataset_files:
+ df[dataset['index']]['status'] = status
+ if status == 'Error':
+ df[dataset['index']]['error_msg'] = msg
+ else:
+ df[dataset['index']]['error_msg'] = ''
+
+ else:
+ df[dataset_index]['status'] = status
+ if status == 'Error':
+ df[dataset_index]['error_msg'] = msg
+ else:
+ df[dataset_index]['error_msg'] = ''
+
+ self.galaxydb.set_sample_dataset_files(self.sample_id, to_json_string(df))
+ log.debug('done.')
except:
- logging.error(traceback.format_exc())
- logging.error('FATAL ERROR')
+ log.error(traceback.format_exc())
+ log.error('FATAL ERROR')
sys.exit(1)
+
+ def get_value(self, dom, tag_name):
+ '''
+ This method extracts the tag value from the xml message
+ '''
+ nodelist = dom.getElementsByTagName(tag_name)[0].childNodes
+ rc = ""
+ for node in nodelist:
+ if node.nodeType == node.TEXT_NODE:
+ rc = rc + node.data
+ return rc
+
+ def get_value_index(self, dom, tag_name, index):
+ '''
+ This method extracts the tag value from the xml message
+ '''
+ try:
+ nodelist = dom.getElementsByTagName(tag_name)[index].childNodes
+ except:
+ return None
+ rc = ""
+ for node in nodelist:
+ if node.nodeType == node.TEXT_NODE:
+ rc = rc + node.data
+ return rc
if __name__ == '__main__':
- logging.info('STARTING %i %s' % (os.getpid(), str(sys.argv)))
- logging.info('daemonized %i' % os.getpid())
+ log.info('STARTING %i %s' % (os.getpid(), str(sys.argv)))
#
# Start the daemon
- #
- dt = DataTransfer(sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4],
- int(sys.argv[5]), int(sys.argv[6]), sys.argv[7], sys.argv[8])
+ #
+ dt = DataTransfer(sys.argv[1])
dt.start()
sys.exit(0)
diff -r 207d0d70483b -r 076f572d7c9d scripts/galaxy_messaging/server/galaxydb_interface.py
--- a/scripts/galaxy_messaging/server/galaxydb_interface.py Tue Apr 20 15:36:03 2010 -0400
+++ b/scripts/galaxy_messaging/server/galaxydb_interface.py Wed Apr 21 10:41:30 2010 -0400
@@ -20,8 +20,8 @@
from sqlalchemy import *
from sqlalchemy.orm import sessionmaker
-logging.basicConfig(level=logging.DEBUG)
-log = logging.getLogger( 'GalaxyDbInterface' )
+#logging.basicConfig(level=logging.DEBUG)
+#log = logging.getLogger( 'GalaxyDbInterface' )
class GalaxyDbInterface(object):
@@ -53,9 +53,8 @@
x = result.fetchone()
if x:
sample_id = x[0]
- log.debug('Sample ID: %i' % sample_id)
+ #log.debug('Sample ID: %i' % sample_id)
return sample_id
- log.warning('This sample %s %s does not belong to any sample in the database.' % (field_name, value))
return -1
def current_state(self, sample_id):
@@ -74,16 +73,16 @@
subsubquery = select(columns=[self.sample_table.c.request_id],
whereclause=self.sample_table.c.id==sample_id)
self.request_id = subsubquery.execute().fetchall()[0][0]
- log.debug('REQUESTID: %i' % self.request_id)
+ #log.debug('REQUESTID: %i' % self.request_id)
subquery = select(columns=[self.request_table.c.request_type_id],
whereclause=self.request_table.c.id==self.request_id)
request_type_id = subquery.execute().fetchall()[0][0]
- log.debug('REQUESTTYPEID: %i' % request_type_id)
+ #log.debug('REQUESTTYPEID: %i' % request_type_id)
query = select(columns=[self.state_table.c.id, self.state_table.c.name],
whereclause=self.state_table.c.request_type_id==request_type_id,
order_by=self.state_table.c.id.asc())
states = query.execute().fetchall()
- log.debug('POSSIBLESTATES: '+ str(states))
+ #log.debug('POSSIBLESTATES: '+ str(states))
return states
def change_state(self, sample_id, new_state=None):
@@ -100,7 +99,7 @@
new_state_id = state_id
if new_state_id == -1:
return
- log.debug('Updating sample_id %i state to %s' % (sample_id, new_state))
+ #log.debug('Updating sample_id %i state to %s' % (sample_id, new_state))
i = self.event_table.insert()
i.execute(update_time=datetime.utcnow(),
create_time=datetime.utcnow(),
@@ -120,7 +119,7 @@
break
if request_complete:
request_state = 'Complete'
- log.debug('Updating request_id %i state to "%s"' % (self.request_id, request_state))
+ #log.debug('Updating request_id %i state to "%s"' % (self.request_id, request_state))
i = self.request_event_table.insert()
i.execute(update_time=datetime.utcnow(),
create_time=datetime.utcnow(),
diff -r 207d0d70483b -r 076f572d7c9d scripts/galaxy_messaging/server/galaxyweb_interface.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/scripts/galaxy_messaging/server/galaxyweb_interface.py Wed Apr 21 10:41:30 2010 -0400
@@ -0,0 +1,132 @@
+import ConfigParser
+import sys, os
+import serial
+import array
+import time
+import optparse,array
+import shutil, traceback
+import urllib,urllib2, cookielib
+
+assert sys.version_info[:2] >= ( 2, 4 )
+new_path = [ os.path.join( os.getcwd(), "lib" ) ]
+new_path.extend( sys.path[1:] ) # remove scripts/ from the path
+sys.path = new_path
+
+from galaxy import eggs
+import pkg_resources
+
+import pkg_resources
+pkg_resources.require( "pycrypto" )
+
+from Crypto.Cipher import Blowfish
+from Crypto.Util.randpool import RandomPool
+from Crypto.Util import number
+
+
+class GalaxyWebInterface(object):
+ def __init__(self, server_host, server_port, datatx_email, datatx_password):
+ self.server_host = server_host#config.get("main", "server_host")
+ self.server_port = server_port#config.get("main", "server_port")
+ self.datatx_email = datatx_email#config.get("main", "datatx_email")
+ self.datatx_password = datatx_password#config.get("main", "datatx_password")
+ try:
+ # create url
+ self.base_url = "http://%s:%s" % (self.server_host, self.server_port)
+ # login
+ url = "%s/user/login?email=%s&password=%s&login_button=Login" % (self.base_url, self.datatx_email, self.datatx_password)
+ cj = cookielib.CookieJar()
+ self.opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
+ #print url
+ f = self.opener.open(url)
+ if f.read().find("ogged in as "+self.datatx_email) == -1:
+ # if the user doesn't exist, create the user
+ url = "%s/user/create?email=%s&username=%s&password=%s&confirm=%s&create_user_button=Submit" % ( self.base_url, self.datatx_email, self.datatx_email, self.datatx_password, self.datatx_password )
+ f = self.opener.open(url)
+ if f.read().find("ogged in as "+self.datatx_email) == -1:
+ raise "The "+self.datatx_email+" user could not login to Galaxy"
+ except:
+ print traceback.format_exc()
+ sys.exit(1)
+
+ def add_to_library(self, server_dir, library_id, folder_id, dbkey=''):
+ '''
+ This method adds the dataset file to the target data library & folder
+ by opening the corresponding url in Galaxy server running.
+ '''
+ try:
+ params = urllib.urlencode(dict( cntrller='library_admin',
+ tool_id='upload1',
+ tool_state='None',
+ library_id=self.encode_id(library_id),
+ folder_id=self.encode_id(folder_id),
+ upload_option='upload_directory',
+ file_type='auto',
+ server_dir=os.path.basename(server_dir),
+ dbkey=dbkey,
+ show_dataset_id='True',
+ runtool_btn='Upload to library'))
+ #url = "http://localhost:8080/library_common/upload_library_dataset?cntrller=librar…"
+ #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
+ url = self.base_url+"/library_common/upload_library_dataset"
+ #print url
+ #print params
+ f = self.opener.open(url, params)
+ if f.read().find("Data Library") == -1:
+ raise "Dataset could not be uploaded to the data library"
+ except:
+ print traceback.format_exc()
+ sys.exit(1)
+
+ def import_to_history(self, ldda_id, library_id, folder_id):
+ try:
+ params = urllib.urlencode(dict( cntrller='library_admin',
+ show_deleted='False',
+ library_id=self.encode_id(library_id),
+ folder_id=self.encode_id(folder_id),
+ ldda_ids=self.encode_id(ldda_id),
+ do_action='import_to_history',
+ use_panels='False'))
+ #url = "http://lion.bx.psu.edu:8080/library_common/act_on_multiple_datasets?library…"
+ #url = base_url+"/library_common/upload_library_dataset?library_id=adb5f5c93f827949&tool_id=upload1&file_type=auto&server_dir=datatx_22858&dbkey=%3F&upload_option=upload_directory&folder_id=529fd61ab1c6cc36&cntrller=library_admin&tool_state=None&runtool_btn=Upload+to+library"
+ url = self.base_url+"/library_common/act_on_multiple_datasets"
+ #print url
+ #print params
+ f = self.opener.open(url, params)
+ x = f.read()
+ if x.find("1 dataset(s) have been imported into your history.") == -1:
+ #print x
+ raise Exception("Dataset could not be imported into history")
+ except:
+ print traceback.format_exc()
+ sys.exit(1)
+
+
+ def run_workflow(self, workflow_id, hid, workflow_step):
+ input = str(workflow_step)+'|input'
+ try:
+ params = urllib.urlencode({'id':self.encode_id(workflow_id),
+ 'run_workflow': 'Run workflow',
+ input: hid})
+ url = self.base_url+"/workflow/run"
+ #print url+'?'+params
+ f = self.opener.open(url, params)
+# if f.read().find("1 dataset(s) have been imported into your history.") == -1:
+# raise Exception("Error in running the workflow")
+ except:
+ print traceback.format_exc()
+ sys.exit(1)
+
+
+ def logout(self):
+ # finally logout
+ f = self.opener.open(self.base_url+'/user/logout')
+
+ def encode_id(self, obj_id ):
+ id_secret = 'changethisinproductiontoo'
+ id_cipher = Blowfish.new( id_secret )
+ # Convert to string
+ s = str( obj_id )
+ # Pad to a multiple of 8 with leading "!"
+ s = ( "!" * ( 8 - len(s) % 8 ) ) + s
+ # Encrypt
+ return id_cipher.encrypt( s ).encode( 'hex' )
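A sketch of the inverse of encode_id under the same assumptions (pycrypto Blowfish in its default ECB mode, '!' padding, the same hard-coded id_secret):

    from Crypto.Cipher import Blowfish

    def decode_id( encoded_id, id_secret='changethisinproductiontoo' ):
        # Hex-decode, decrypt, then strip the '!' padding added by encode_id.
        id_cipher = Blowfish.new( id_secret )
        return int( id_cipher.decrypt( encoded_id.decode( 'hex' ) ).lstrip( '!' ) )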
diff -r 207d0d70483b -r 076f572d7c9d templates/admin/requests/dataset.mako
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/templates/admin/requests/dataset.mako Wed Apr 21 10:41:30 2010 -0400
@@ -0,0 +1,71 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<br/>
+<br/>
+
+<ul class="manage-table-actions">
+ <li>
+ <a class="action-button" href="${h.url_for( controller='requests_admin', action='show_datatx_page', sample_id=trans.security.encode_id(sample.id) )}">
+ <span>Dataset transfer page</span></a>
+ </li>
+</ul>
+
+<div class="toolForm">
+ <div class="toolFormTitle">Dataset details</div>
+ <div class="toolFormBody">
+ <form name="dataset_details" action="${h.url_for( controller='requests_admin', action='dataset_details', save_changes=True, sample_id=trans.security.encode_id(sample.id), dataset_index=dataset_index )}" method="post" >
+ <%
+ dataset = sample.dataset_files[dataset_index]
+ %>
+ <div class="form-row">
+ <label>Name:</label>
+ <div style="float: left; width: 250px; margin-right: 10px;">
+ %if dataset['status'] in [sample.transfer_status.IN_QUEUE, sample.transfer_status.NOT_STARTED]:
+ <input type="text" name="name" value="${dataset['name']}" size="60"/>
+ %else:
+ ${dataset['name']}
+ %endif
+
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>File on the Sequencer:</label>
+ <div style="float: left; width: 250px; margin-right: 10px;">
+ ${dataset['filepath']}
+ ##<input type="text" name="filepath" value="${dataset['filepath']}" size="100" readonly/>
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Size:</label>
+ <div style="float: left; width: 250px; margin-right: 10px;">
+ ${dataset.get('size', 'Unknown')}
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Transfer status:</label>
+ <div style="float: left; width: 250px; margin-right: 10px;">
+ ${dataset['status']}
+ <br/>
+ %if dataset['status'] == sample.transfer_status.ERROR:
+ ${dataset['error_msg']}
+ %endif
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ %if dataset['status'] in [sample.transfer_status.IN_QUEUE, sample.transfer_status.NOT_STARTED]:
+ <div class="form-row">
+ <input type="submit" name="save" value="Save"/>
+ </div>
+ %endif
+ </form>
+ </div>
+</div>
\ No newline at end of file
diff -r 207d0d70483b -r 076f572d7c9d templates/admin/requests/get_data.mako
--- a/templates/admin/requests/get_data.mako Tue Apr 20 15:36:03 2010 -0400
+++ b/templates/admin/requests/get_data.mako Wed Apr 21 10:41:30 2010 -0400
@@ -53,29 +53,44 @@
<div class="toolForm">
%if len(dataset_files):
## <form name="get_data" action="${h.url_for( controller='requests_admin', action='get_data', sample_id=sample.id)}" method="post" >
+ <div class="form-row">
+ <h4>Sample Dataset(s)</h4>
+ %if sample.untransferred_dataset_files():
<div class="form-row">
- <h4>Sample Dataset(s)</h4>
- <div class="form-row">
- <table class="grid">
- <thead>
- <tr>
- <th>Dataset File</th>
- <th>Transfer Status</th>
- <th></th>
- </tr>
- <thead>
- <tbody>
- %for dataset_index, dataset_file in enumerate(dataset_files):
- ${sample_dataset_files( dataset_index, dataset_file[0], dataset_file[1] )}
- %endfor
- </tbody>
- </table>
- </div>
- </div>
+ <ul class="manage-table-actions">
+ <li>
+ <a class="action-button" href="${h.url_for( controller='requests_admin', action='get_data', start_transfer_button=True, sample_id=sample.id )}">
+ <span>Start transfer</span></a>
+ </li>
+ </ul>
+ </div>
+ %endif
+ <div class="form-row">
+ <table class="grid">
+ <thead>
+ <tr>
+ <th>Dataset File</th>
+ <th>Transfer Status</th>
+ <th></th>
+ </tr>
+ </thead>
+ <tbody>
+ %for dataset_index, dataset_file in enumerate(dataset_files):
+ ${sample_dataset_files( dataset_index, dataset_file['name'], dataset_file['status'] )}
+ %endfor
+ </tbody>
+ </table>
+ </div>
+ </div>
+
## </form>
##</div>
+
+
+<br/>
<br/>
%endif
+
##<div class="toolForm">
<form name="get_data" action="${h.url_for( controller='requests_admin', action='get_data', sample_id=sample.id)}" method="post" >
<div class="form-row">
@@ -102,24 +117,24 @@
navigate away from this page. Once the transfer is complete
the dataset(s) will show up on this page.
</div>
- <input type="submit" name="start_transfer_button" value="Transfer"/>
+ <input type="submit" name="select_files_button" value="Select"/>
</div>
</div>
</div>
</form>
</div>
-<%def name="sample_dataset_files( dataset_index, dataset_file, status )">
+<%def name="sample_dataset_files( dataset_index, dataset_name, status )">
<tr>
<td>
-## <label class="msg_head"><a href="${h.url_for( controller='requests_admin', action='show_dataset_file', sample_id=trans.security.encode_id(sample.id), dataset_index=dataset_index )}">${dataset_file.split('/')[-1]}</a></label>
- <div class="msg_head"><u>${dataset_file.split('/')[-1]}</u></div>
- <div class="msg_body">
- ${dataset_file}
- </div>
+ <label class="msg_head"><a href="${h.url_for( controller='requests_admin', action='dataset_details', sample_id=trans.security.encode_id(sample.id), dataset_index=dataset_index )}">${dataset_name}</a></label>
+## <div class="msg_head"><u>${dataset_file.split('/')[-1]}</u></div>
+## <div class="msg_body">
+## ${dataset_file}
+## </div>
</td>
<td>
- %if status == sample.transfer_status.IN_PROGRESS:
+ %if status not in [sample.transfer_status.NOT_STARTED, sample.transfer_status.COMPLETE]:
<i>${status}</i>
%else:
${status}
diff -r 207d0d70483b -r 076f572d7c9d universe_wsgi.ini.sample
--- a/universe_wsgi.ini.sample Tue Apr 20 15:36:03 2010 -0400
+++ b/universe_wsgi.ini.sample Wed Apr 21 10:41:30 2010 -0400
@@ -287,7 +287,7 @@
# to be set up with a user account and other parameters listed below. The 'host'
# and 'port' fields should point to where the RabbitMQ server is running.
-#[galaxy:amqp]
+[galaxy_amqp]
#host = 127.0.0.1
#port = 5672
#userid = galaxy
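Uncommented and filled in, the renamed section might look as follows; host, port, and userid match the sample defaults above, while the remaining values are placeholders for the option names the messaging code reads:

    [galaxy_amqp]
    host = 127.0.0.1
    port = 5672
    userid = galaxy
    password = galaxy
    virtual_host = galaxy_messaging_engine
    exchange = galaxy_exchange
    routing_key = bar_code_scanner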
details: http://www.bx.psu.edu/hg/galaxy/rev/207d0d70483b
changeset: 3673:207d0d70483b
user: Kanwei Li <kanwei(a)gmail.com>
date: Tue Apr 20 15:36:03 2010 -0400
description:
Fix history renaming on Saved Histories grid
diffstat:
lib/galaxy/web/controllers/history.py | 3 ++-
1 files changed, 2 insertions(+), 1 deletions(-)
diffs (13 lines):
diff -r 18d0d7fd543a -r 207d0d70483b lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Tue Apr 20 13:19:44 2010 -0400
+++ b/lib/galaxy/web/controllers/history.py Tue Apr 20 15:36:03 2010 -0400
@@ -177,7 +177,8 @@
operation = kwargs['operation'].lower()
if operation == "share or publish":
return self.sharing( trans, **kwargs )
- if operation == "rename":
+ if operation == "rename" and kwargs.get('id', None): # Don't call rename if no ids
+ del kwargs['name'] # Remove ajax name param that rename method uses
return self.rename( trans, **kwargs )
history_ids = util.listify( kwargs.get( 'id', [] ) )
# Display no message by default
10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/18d0d7fd543a
changeset: 3672:18d0d7fd543a
user: Kanwei Li <kanwei(a)gmail.com>
date: Tue Apr 20 13:19:44 2010 -0400
description:
GFF to BED converter now converts spaces to underscores in feature names to avoid a UCSC display problem [Brad Chapman]. Closes #323
diffstat:
lib/galaxy/datatypes/converters/gff_to_bed_converter.py | 5 ++++-
1 files changed, 4 insertions(+), 1 deletions(-)
diffs (15 lines):
diff -r 7cb131814770 -r 18d0d7fd543a lib/galaxy/datatypes/converters/gff_to_bed_converter.py
--- a/lib/galaxy/datatypes/converters/gff_to_bed_converter.py Mon Apr 19 17:43:39 2010 -0400
+++ b/lib/galaxy/datatypes/converters/gff_to_bed_converter.py Tue Apr 20 13:19:44 2010 -0400
@@ -21,7 +21,10 @@
strand = '+'
# GFF format: chrom source, name, chromStart, chromEnd, score, strand
# Bed format: chrom, chromStart, chromEnd, name, score, strand
- out.write( "%s\t%s\t%s\t%s\t0\t%s\n" %( elems[0], start, elems[4], elems[2], strand ) )
+ #
+ # Replace any spaces in the name with underscores so UCSC will not complain
+ name = elems[2].replace(" ", "_")
+ out.write( "%s\t%s\t%s\t%s\t0\t%s\n" %( elems[0], start, elems[4], name, strand ) )
except:
skipped_lines += 1
if not first_skipped_line:
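A one-line worked example of the fix, with an invented feature name and the converter's usual 0-based start shift:

    # GFF input fields:  chr1  example  my gene  10  100  .  +
    # BED output line:   chr1  9  100  my_gene  0  +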
10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/7cb131814770
changeset: 3671:7cb131814770
user: James Taylor <james(a)jamestaylor.org>
date: Mon Apr 19 17:43:39 2010 -0400
description:
Fix ordering in display_structured; also use aggressive eager loading to make it massively faster
diffstat:
lib/galaxy/util/odict.py | 10 +++++++---
lib/galaxy/web/controllers/history.py | 25 ++++++++++++++++---------
templates/history/display_structured.mako | 13 ++++++++++---
3 files changed, 33 insertions(+), 15 deletions(-)
diffs (128 lines):
diff -r b8d25aabb98d -r 7cb131814770 lib/galaxy/util/odict.py
--- a/lib/galaxy/util/odict.py Tue Apr 20 11:47:51 2010 -0400
+++ b/lib/galaxy/util/odict.py Mon Apr 19 17:43:39 2010 -0400
@@ -31,9 +31,9 @@
self._keys = []
def copy(self):
- new = odict()
- new.update( self )
- return new
+ new = odict()
+ new.update( self )
+ return new
def items(self):
return zip(self._keys, self.values())
@@ -82,3 +82,7 @@
def __iter__( self ):
for key in self._keys:
yield key
+
+ def reverse( self ):
+ self._keys.reverse()
+
diff -r b8d25aabb98d -r 7cb131814770 lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Tue Apr 20 11:47:51 2010 -0400
+++ b/lib/galaxy/web/controllers/history.py Mon Apr 19 17:43:39 2010 -0400
@@ -1,6 +1,7 @@
from galaxy.web.base.controller import *
from galaxy.web.framework.helpers import time_ago, iff, grids
from galaxy import util
+from galaxy.util.odict import odict
from galaxy.model.mapping import desc
from galaxy.model.orm import *
from galaxy.util.json import *
@@ -336,25 +337,31 @@
"""
# Get history
if id is None:
- history = trans.history
+ id = trans.history.id
else:
id = trans.security.decode_id( id )
- history = trans.sa_session.query( model.History ).get( id )
- assert history
- assert history.user and ( history.user == trans.user ) or ( history == trans.history )
+ # Expunge history from the session to allow us to force a reload
+ # with a bunch of eager loaded joins
+ trans.sa_session.expunge( trans.history )
+ history = trans.sa_session.query( model.History ).options(
+ eagerload_all( 'active_datasets.creating_job_associations.job.workflow_invocation_step.workflow_invocation.workflow' ),
+ eagerload_all( 'active_datasets.children' )
+ ).get( id )
+ assert history
+ assert history.user and ( history.user.id == trans.user.id ) or ( history.id == trans.history.id )
# Resolve jobs and workflow invocations for the datasets in the history
# items is filled with items (hdas, jobs, or workflows) that go at the
# top level
items = []
# First go through and group hdas by job, if there is no job they get
# added directly to items
- jobs = dict()
+ jobs = odict()
for hda in history.active_datasets:
# Follow "copied from ..." association until we get to the original
# instance of the dataset
original_hda = hda
- while original_hda.copied_from_history_dataset_association:
- original_hda = original_hda.copied_from_history_dataset_association
+ ## while original_hda.copied_from_history_dataset_association:
+ ## original_hda = original_hda.copied_from_history_dataset_association
# Check if the job has a creating job, most should, datasets from
# before jobs were tracked, or from the upload tool before it
# created a job, may not
@@ -370,7 +377,7 @@
else:
jobs[ job ] = [ ( hda, None ) ]
# Second, go through the jobs and connect to workflows
- wf_invocations = dict()
+ wf_invocations = odict()
for job, hdas in jobs.iteritems():
# Job is attached to a workflow step, follow it to the
# workflow_invocation and group
@@ -1025,4 +1032,4 @@
msg = 'Clone with name "%s" is now included in your previously stored histories.' % new_history.name
else:
msg = '%d cloned histories are now included in your previously stored histories.' % len( histories )
- return trans.show_ok_message( msg )
\ No newline at end of file
+ return trans.show_ok_message( msg )
diff -r b8d25aabb98d -r 7cb131814770 templates/history/display_structured.mako
--- a/templates/history/display_structured.mako Tue Apr 20 11:47:51 2010 -0400
+++ b/templates/history/display_structured.mako Mon Apr 19 17:43:39 2010 -0400
@@ -16,6 +16,7 @@
.workflow {
border: solid gray 1px;
+ margin: 5px 0;
border-left-width: 5px;
}
@@ -96,9 +97,15 @@
<%def name="render_item_job( job, children )">
<div class="tool toolForm">
- <div class="header toolFormTitle">Tool: ${trans.app.toolbox.tools_by_id[job.tool_id].name}</div>
+ <%
+ if job.tool_id in trans.app.toolbox.tools_by_id:
+ tool_name = trans.app.toolbox.tools_by_id[job.tool_id].name
+ else:
+ tool_name = "Unknown tool with id '%s'" % job.tool_id
+ %>
+ <div class="header toolFormTitle">Tool: ${tool_name}</div>
<div class="body toolFormBody">
- %for e, c in children:
+ %for e, c in reversed( children ):
${render_item( e, c )}
%endfor
</div>
@@ -111,7 +118,7 @@
<div class="workflow">
<div class="header">Workflow: ${wf.workflow.name}</div>
<div class="body">
- %for e, c in children:
+ %for e, c in reversed( children ):
${render_item( e, c )}
%endfor
</div>
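The switch from dict() to odict() above is what makes the reversed() iteration in the template meaningful: odict preserves insertion order, so jobs and workflow invocations come back in the order they were added. A small illustration using the reverse() helper added in this changeset:

    from galaxy.util.odict import odict

    d = odict()
    d[ 'first_job' ] = [ 'hda1' ]
    d[ 'second_job' ] = [ 'hda2' ]
    print [ key for key in d ]   # ['first_job', 'second_job'] -- insertion order
    d.reverse()                  # helper added above
    print [ key for key in d ]   # ['second_job', 'first_job']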
10 May '10
details: http://www.bx.psu.edu/hg/galaxy/rev/b8d25aabb98d
changeset: 3670:b8d25aabb98d
user: Dan Blankenberg <dan(a)bx.psu.edu>
date: Tue Apr 20 11:47:51 2010 -0400
description:
Make failure to load datatype converters and display applications more graceful when given nonexistent file paths: a missing converter still allows the application to start, and a missing display application no longer prevents a datatype from loading.
diffstat:
lib/galaxy/datatypes/registry.py | 46 ++++++++++++++++++++++-----------------
1 files changed, 26 insertions(+), 20 deletions(-)
diffs (63 lines):
diff -r e47ff545931f -r b8d25aabb98d lib/galaxy/datatypes/registry.py
--- a/lib/galaxy/datatypes/registry.py Mon Apr 19 17:41:35 2010 -0400
+++ b/lib/galaxy/datatypes/registry.py Tue Apr 20 11:47:51 2010 -0400
@@ -90,20 +90,22 @@
mimetype = composite_file.get( 'mimetype', None )
self.datatypes_by_extension[extension].add_composite_file( name, optional=optional, mimetype=mimetype )
for display_app in elem.findall( 'display' ):
- display_file = display_app.get( 'file', None )
- assert display_file is not None, "A file must be specified for a datatype display tag."
- inherit = galaxy.util.string_as_bool( display_app.get( 'inherit', 'False' ) )
- display_app = DisplayApplication.from_file( os.path.join( self.display_applications_path, display_file ), self )
- if display_app:
- if display_app.id in self.display_applications:
- #if we already loaded this display application, we'll use the first one again
- display_app = self.display_applications[ display_app.id ]
- self.log.debug( "Loaded display application '%s' for datatype '%s', inherit=%s" % ( display_app.id, extension, inherit ) )
- self.display_applications[ display_app.id ] = display_app #Display app by id
- self.datatypes_by_extension[ extension ].add_display_application( display_app )
- if inherit and ( self.datatypes_by_extension[extension], display_app ) not in inherit_display_application_by_class:
- #subclass inheritance will need to wait until all datatypes have been loaded
- inherit_display_application_by_class.append( ( self.datatypes_by_extension[extension], display_app ) )
+ display_file = os.path.join( self.display_applications_path, display_app.get( 'file', None ) )
+ try:
+ inherit = galaxy.util.string_as_bool( display_app.get( 'inherit', 'False' ) )
+ display_app = DisplayApplication.from_file( display_file, self )
+ if display_app:
+ if display_app.id in self.display_applications:
+ #if we already loaded this display application, we'll use the first one again
+ display_app = self.display_applications[ display_app.id ]
+ self.log.debug( "Loaded display application '%s' for datatype '%s', inherit=%s" % ( display_app.id, extension, inherit ) )
+ self.display_applications[ display_app.id ] = display_app #Display app by id
+ self.datatypes_by_extension[ extension ].add_display_application( display_app )
+ if inherit and ( self.datatypes_by_extension[extension], display_app ) not in inherit_display_application_by_class:
+ #subclass inheritance will need to wait until all datatypes have been loaded
+ inherit_display_application_by_class.append( ( self.datatypes_by_extension[extension], display_app ) )
+ except:
+ self.log.exception( "error reading display application from path: %s" % display_file )
except Exception, e:
self.log.warning( 'Error loading datatype "%s", problem: %s' % ( extension, str( e ) ) )
# Handle display_application subclass inheritance here:
@@ -290,12 +292,16 @@
tool_config = elem[0]
source_datatype = elem[1]
target_datatype = elem[2]
- converter = toolbox.load_tool( os.path.join( self.datatype_converters_path, tool_config ) )
- toolbox.tools_by_id[converter.id] = converter
- if source_datatype not in self.datatype_converters:
- self.datatype_converters[source_datatype] = odict()
- self.datatype_converters[source_datatype][target_datatype] = converter
- self.log.debug( "Loaded converter: %s", converter.id )
+ converter_path = os.path.join( self.datatype_converters_path, tool_config )
+ try:
+ converter = toolbox.load_tool( converter_path )
+ toolbox.tools_by_id[converter.id] = converter
+ if source_datatype not in self.datatype_converters:
+ self.datatype_converters[source_datatype] = odict()
+ self.datatype_converters[source_datatype][target_datatype] = converter
+ self.log.debug( "Loaded converter: %s", converter.id )
+ except:
+ self.log.exception( "error reading converter from path: %s" % converter_path )
def load_external_metadata_tool( self, toolbox ):
"""Adds a tool which is used to set external metadata"""
details: http://www.bx.psu.edu/hg/galaxy/rev/e47ff545931f
changeset: 3669:e47ff545931f
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Mon Apr 19 17:41:35 2010 -0400
description:
Cuffcompare wrapper.
diffstat:
tools/ngs_rna/cuffcompare_wrapper.py | 86 ++++++++++++++++++++
tools/ngs_rna/cuffcompare_wrapper.xml | 142 ++++++++++++++++++++++++++++++++++
2 files changed, 228 insertions(+), 0 deletions(-)
diffs (237 lines):
diff -r 91b8f0abffc8 -r e47ff545931f tools/ngs_rna/cuffcompare_wrapper.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/ngs_rna/cuffcompare_wrapper.py Mon Apr 19 17:41:35 2010 -0400
@@ -0,0 +1,86 @@
+#!/usr/bin/env python
+
+import optparse, os, shutil, subprocess, sys, tempfile
+
+def stop_err( msg ):
+ sys.stderr.write( "%s\n" % msg )
+ sys.exit()
+
+def __main__():
+ #Parse Command Line
+ parser = optparse.OptionParser()
+ parser.add_option( '-r', dest='ref_annotation', help='An optional "reference" annotation GTF. Each sample is matched against this file, and sample isoforms are tagged as overlapping, matching, or novel where appropriate. See the refmap and tmap output file descriptions below.' )
+ parser.add_option( '-R', action="store_true", dest='ignore_nonoverlap', help='If -r was specified, this option causes cuffcompare to ignore reference transcripts that are not overlapped by any transcript in one of cuff1.gtf,...,cuffN.gtf. Useful for ignoring annotated transcripts that are not present in your RNA-Seq samples and thus adjusting the "sensitivity" calculation in the accuracy report written in the transcripts accuracy file' )
+
+ # Wrapper / Galaxy options.
+ parser.add_option( '-A', '--transcripts-accuracy-output', dest='transcripts_accuracy_output_file', help='' )
+ parser.add_option( '-B', '--transcripts-combined-output', dest='transcripts_combined_output_file', help='' )
+ parser.add_option( '-C', '--transcripts-tracking-output', dest='transcripts_tracking_output_file', help='' )
+
+ (options, args) = parser.parse_args()
+
+ # Make temp directory for output.
+ tmp_output_dir = tempfile.mkdtemp()
+
+ # Build command.
+
+ # Base.
+ cmd = "cuffcompare -o cc_output"
+
+ # Add options.
+ if options.ref_annotation:
+ cmd += " -r %s" % options.ref_annotation
+ if options.ignore_nonoverlap:
+ cmd += " -R "
+
+ # Add input files.
+ if type(args) is list:
+ args = " ".join(args)
+ cmd += " " + args
+ print cmd
+
+ # Run command.
+ try:
+ tmp_name = tempfile.NamedTemporaryFile( dir=tmp_output_dir ).name
+ tmp_stderr = open( tmp_name, 'wb' )
+ proc = subprocess.Popen( args=cmd, shell=True, cwd=tmp_output_dir, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ tmp_stderr.close()
+
+ # Get stderr, allowing for case where it's very large.
+ tmp_stderr = open( tmp_name, 'rb' )
+ stderr = ''
+ buffsize = 1048576
+ try:
+ while True:
+ stderr += tmp_stderr.read( buffsize )
+ if not stderr or len( stderr ) % buffsize != 0:
+ break
+ except OverflowError:
+ pass
+ tmp_stderr.close()
+
+ # Error checking.
+ if returncode != 0:
+ raise Exception, stderr
+
+ # check that there are results in the output file
+ if len( open( tmp_output_dir + "/cc_output", 'rb' ).read().strip() ) == 0:
+ raise Exception, 'The main output file is empty, there may be an error with your input file or settings.'
+ except Exception, e:
+ stop_err( 'Error running cuffcompare. ' + str( e ) )
+
+ # Copy output files from tmp directory to specified files.
+ try:
+ try:
+ shutil.copyfile( tmp_output_dir + "/cc_output", options.transcripts_accuracy_output_file )
+ shutil.copyfile( tmp_output_dir + "/cc_output.combined.gtf", options.transcripts_combined_output_file )
+ shutil.copyfile( tmp_output_dir + "/cc_output.tracking", options.transcripts_tracking_output_file )
+ except Exception, e:
+ stop_err( 'Error in cuffcompare:\n' + str( e ) )
+ finally:
+ # Clean up temp dirs
+ if os.path.exists( tmp_output_dir ):
+ shutil.rmtree( tmp_output_dir )
+
+if __name__=="__main__": __main__()
\ No newline at end of file
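One detail worth noting in the wrapper above (the Cufflinks wrapper in changeset d3ff52561d78 below adopts the same approach): stderr is routed to a named temp file rather than streamed through subprocess.PIPE, which avoids the classic deadlock of a full stderr pipe while proc.wait() blocks; the file is then read back in fixed-size chunks. A condensed sketch of that pattern in the wrapper's own Python 2 idiom; the function name is illustrative:

import subprocess, tempfile

def run_and_capture_stderr( cmd, work_dir, buffsize=1048576 ):
    # Run cmd with stderr sent to a temp file, then read the file
    # back in bounded chunks and return ( returncode, stderr text ).
    tmp_stderr = tempfile.NamedTemporaryFile( dir=work_dir )
    proc = subprocess.Popen( args=cmd, shell=True, cwd=work_dir, stderr=tmp_stderr.fileno() )
    returncode = proc.wait()
    tmp_stderr.seek( 0 )
    chunks = []
    while True:
        chunk = tmp_stderr.read( buffsize )
        if not chunk:
            break
        chunks.append( chunk )
    tmp_stderr.close()
    return returncode, ''.join( chunks )

Seeking back on the open handle stands in for the reopen-by-name step in the wrapper; routing through a file instead of a pipe is what removes the deadlock risk either way.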
diff -r 91b8f0abffc8 -r e47ff545931f tools/ngs_rna/cuffcompare_wrapper.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/ngs_rna/cuffcompare_wrapper.xml Mon Apr 19 17:41:35 2010 -0400
@@ -0,0 +1,142 @@
+<tool id="cuffcompare" name="Cuffcompare" version="0.8.2">
+ <description>compare assembled transcripts to a reference annotation and track Cufflinks transcripts across multiple experiments</description>
+ <command interpreter="python">
+ cuffcompare_wrapper.py
+ --transcripts-accuracy-output=$transcripts_accuracy
+ --transcripts-combined-output=$transcripts_combined
+ --transcripts-tracking-output=$transcripts_tracking
+ #if $annotation.use_ref_annotation == "Yes":
+ -r $annotation.reference_annotation
+ #if $annotation.ignore_nonoverlapping_reference:
+ -R
+ #end if
+ #end if
+ $input1
+ $input2
+ </command>
+ <inputs>
+ <param format="gtf" name="input1" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <param format="gtf" name="input2" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <conditional name="annotation">
+ <param name="use_ref_annotation" type="select" label="Use Reference Annotation?">
+ <option value="No">No</option>
+ <option value="Yes">Yes</option>
+ </param>
+ <when value="Yes">
+ <param format="gtf" name="reference_annotation" type="data" label="Reference Annotation" help=""/>
+ <param name="ignore_nonoverlapping_reference" type="boolean" label="Ignore reference transcripts that are not overlapped by any transcript in input files"/>
+ </when>
+ <when value="No">
+ </when>
+ </conditional>
+ </inputs>
+
+ <outputs>
+ <data format="gtf" name="transcripts_combined" />
+ <data format="tracking" name="transcripts_tracking" />
+ <data format="gtf" name="transcripts_accuracy" />
+ </outputs>
+
+ <tests>
+ <test>
+ </test>
+ </tests>
+
+ <help>
+**Cuffcompare Overview**
+
+Cuffcompare is part of Cufflinks_. Cuffcompare helps you: (a) compare your assembled transcripts to a reference annotation and (b) track Cufflinks transcripts across multiple experiments (e.g. across a time course). Please cite: Trapnell C, Williams BA, Pertea G, Mortazavi AM, Kwan G, van Baren MJ, Salzberg SL, Wold B, Pachter L. Transcript assembly and abundance estimation from RNA-Seq reveals thousands of new transcripts and switching among isoforms. (manuscript in press)
+
+.. _Cufflinks: http://cufflinks.cbcb.umd.edu/
+
+------
+
+**Know what you are doing**
+
+.. class:: warningmark
+
+There is no such thing (yet) as an automated gearshift in expression analysis. It is all like stick-shift driving in San Francisco. In other words, running this tool with default parameters will probably not give you meaningful results. A way to deal with this is to **understand** the parameters by carefully reading the `documentation`__ and experimenting. Fortunately, Galaxy makes experimenting easy.
+
+.. __: http://cufflinks.cbcb.umd.edu/manual.html#cuffcompare
+
+------
+
+**Input format**
+
+Cuffcompare takes Cufflinks' GTF output as input, and can optionally take a "reference" annotation (such as from Ensembl___).
+
+.. ___: http://www.todo.org
+
+------
+
+**Outputs**
+
+Cuffcompare produces the following output files:
+
+Transcripts Accuracy File:
+
+Cuffcompare reports various statistics related to the "accuracy" of the transcripts in each sample when compared to the reference annotation data. The typical gene finding measures of "sensitivity" and "specificity" (as defined in Burset, M., Guigó, R. : Evaluation of gene structure prediction programs (1996) Genomics, 34 (3), pp. 353-367. doi: 10.1006/geno.1996.0298) are calculated at various levels (nucleotide, exon, intron, transcript, gene) for each input file and reported in this file. The Sn and Sp columns show specificity and sensitivity values at each level, while the fSn and fSp columns are "fuzzy" variants of these same accuracy calculations, allowing for a very small variation in exon boundaries to still be counted as a "match".
+
+Transcripts Combined File:
+
+Cuffcompare reports a GTF file containing the "union" of all transfrags in each sample. If a transfrag is present in both samples, it is reported once in the combined GTF.
+
+Transcripts Tracking File:
+
+This file matches transcripts up between samples. Each row contains a transcript structure that is present in one or more input GTF files. Because the transcripts will generally have different IDs (unless you assembled your RNA-Seq reads against a reference transcriptome), cuffcompare examines the structure of each of the transcripts, matching transcripts that agree on the coordinates and order of all of their introns, as well as strand. Matching transcripts are allowed to differ on the length of the first and last exons, since these lengths will naturally vary from sample to sample due to the random nature of sequencing.
+If you ran cuffcompare with the -r option, the first and second columns contain the closest matching reference transcript to the one described by each row.
+
+Here's an example of a line from the tracking file::
+
+ TCONS_00000045 XLOC_000023 Tcea|uc007afj.1 j \
+ q1:exp.115|exp.115.0|100|3.061355|0.350242|0.350207 \
+ q2:60hr.292|60hr.292.0|100|4.094084|0.000000|0.000000
+
+In this example, a transcript present in the two input files, called exp.115.0 in the first and 60hr.292.0 in the second, doesn't match any reference transcript exactly, but shares exons with uc007afj.1, an isoform of the gene Tcea, as indicated by the class code j. The first five columns are as follows::
+
+ Column number Column name Example Description
+ -----------------------------------------------------------------------
+ 1 Cufflinks transfrag id TCONS_00000045 A unique internal id for the transfrag
+ 2 Cufflinks locus id XLOC_000023 A unique internal id for the locus
+ 3 Reference gene id Tcea The gene_name attribute of the reference GTF record for this transcript, or '-' if no reference transcript overlaps this Cufflinks transcript
+ 4 Reference transcript id uc007afj.1 The transcript_id attribute of the reference GTF record for this transcript, or '-' if no reference transcript overlaps this Cufflinks transcript
+ 5 Class code c The type of match between the Cufflinks transcripts in column 6 and the reference transcript. See class codes
+
+Each of the columns after the fifth has the following format::
+
+ qJ:gene_id|transcript_id|FMI|FPKM|conf_lo|conf_hi
+
+A transcript need not be present in all samples to be reported in the tracking file. A sample not containing a transcript will have a "-" in its entry in the row for that transcript.
+
+Class Codes
+
+If you ran cuffcompare with the -r option, tracking rows will contain the following values. If you did not use -r, the rows will all contain "-" in their class code column::
+
+ Priority Code Description
+ ---------------------------------
+ 1 = Match
+ 2 c Contained
+ 3 j New isoform
+ 4 e A single exon transcript overlapping a reference exon and at least 10 bp of a reference intron, indicating a possible pre-mRNA fragment.
+ 5 i A single exon transcript falling entirely within a reference intron
+ 6 r Repeat. Currently determined by looking at the reference sequence and applied to transcripts where at least 50% of the bases are lower case
+ 7 p Possible polymerase run-on fragment
+ 8 u Unknown, intergenic transcript
+ 9 o Unknown, generic overlap with reference
+ 10 . (.tracking file only, indicates multiple classifications)
+
+-------
+
+**Settings**
+
+All of the options have a default value. You can change any of them. Most of the options in Cuffcompare have been implemented here.
+
+------
+
+**Cuffcompare parameter list**
+
+This is a list of implemented Cuffcompare options::
+
+ -r An optional "reference" annotation GTF. Each sample is matched against this file, and sample isoforms are tagged as overlapping, matching, or novel where appropriate. See the refmap and tmap output file descriptions below.
+ -R If -r was specified, this option causes cuffcompare to ignore reference transcripts that are not overlapped by any transcript in one of cuff1.gtf,...,cuffN.gtf. Useful for ignoring annotated transcripts that are not present in your RNA-Seq samples and thus adjusting the "sensitivity" calculation in the accuracy report written in the transcripts_accuracy file
+ </help>
+</tool>
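Given the column table and the qJ:gene_id|transcript_id|FMI|FPKM|conf_lo|conf_hi entry format documented in the help text above, a tracking row is mechanical to parse. A hedged sketch; whitespace-delimited columns, the gene|transcript pairing in the reference column, and numeric FMI/FPKM fields are assumptions read off the example row, not a Galaxy or Cufflinks API:

def parse_tracking_row( line ):
    # Columns: transfrag id, locus id, reference ids, class code,
    # then one qJ:... entry (or '-') per input sample.
    fields = line.split()
    transfrag_id, locus_id, ref_ids, class_code = fields[0], fields[1], fields[2], fields[3]
    samples = []
    for entry in fields[4:]:
        if entry == '-':
            samples.append( None )  # transcript absent from this sample
            continue
        sample_label, rest = entry.split( ':', 1 )
        gene_id, transcript_id, fmi, fpkm, conf_lo, conf_hi = rest.split( '|' )
        samples.append( dict( sample=sample_label, gene_id=gene_id,
                              transcript_id=transcript_id, FMI=int( fmi ),
                              FPKM=float( fpkm ), conf_lo=float( conf_lo ),
                              conf_hi=float( conf_hi ) ) )
    return transfrag_id, locus_id, ref_ids, class_code, samples

For the example row above this yields class code 'j' and two sample entries: exp.115.0 with FPKM 3.061355 and 60hr.292.0 with FPKM 4.094084.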
details: http://www.bx.psu.edu/hg/galaxy/rev/91b8f0abffc8
changeset: 3668:91b8f0abffc8
user: Kanwei Li <kanwei(a)gmail.com>
date: Mon Apr 19 16:03:24 2010 -0400
description:
User-custom dbkeys can now be set for datasets in Edit Attributes and upload
Always use autocomplete for dbkey entry (previously used only for selects with 20+ options)
diffstat:
lib/galaxy/web/controllers/tracks.py | 15 +++++++--------
lib/galaxy/web/framework/__init__.py | 12 +++++++++---
static/scripts/galaxy.base.js | 8 ++++----
static/scripts/packed/galaxy.base.js | 2 +-
4 files changed, 21 insertions(+), 16 deletions(-)
diffs (104 lines):
diff -r 3c6ffa5362d2 -r 91b8f0abffc8 lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py Mon Apr 19 14:09:07 2010 -0400
+++ b/lib/galaxy/web/controllers/tracks.py Mon Apr 19 16:03:24 2010 -0400
@@ -21,7 +21,6 @@
from galaxy.web.framework import simplejson
from galaxy.web.framework.helpers import time_ago, grids
from galaxy.util.bunch import Bunch
-from galaxy.util import dbnames
from galaxy.visualization.tracks.data.array_tree import ArrayTreeDataProvider
from galaxy.visualization.tracks.data.interval_index import IntervalIndexDataProvider
@@ -79,7 +78,7 @@
"""
available_tracks = None
- len_dbkeys = None
+ len_files = None
@web.expose
@web.require_login()
@@ -91,17 +90,17 @@
@web.expose
@web.require_login()
def new_browser( self, trans ):
- if not self.len_dbkeys:
+ if not self.len_files:
len_files = glob.glob(os.path.join( trans.app.config.tool_data_path, 'shared','ucsc','chrom', "*.len" ))
- len_files = [ os.path.split(f)[1].split(".len")[0] for f in len_files ] # get xxx.len
- loaded_dbkeys = dbnames
- self.len_dbkeys = [ (k, v) for k, v in loaded_dbkeys if k in len_files ]
+ self.len_files = [ os.path.split(f)[1].split(".len")[0] for f in len_files ] # get xxx.len
- user_keys = None
+ user_keys = {}
user = trans.get_user()
if 'dbkeys' in user.preferences:
user_keys = from_json_string( user.preferences['dbkeys'] )
- return trans.fill_template( "tracks/new_browser.mako", user_keys=user_keys, dbkeys=self.len_dbkeys )
+
+ dbkeys = [ (k, v) for k, v in trans.db_builds if k in self.len_files or k in user_keys ]
+ return trans.fill_template( "tracks/new_browser.mako", dbkeys=dbkeys )
@web.json
@web.require_login()
diff -r 3c6ffa5362d2 -r 91b8f0abffc8 lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py Mon Apr 19 14:09:07 2010 -0400
+++ b/lib/galaxy/web/framework/__init__.py Mon Apr 19 16:03:24 2010 -0400
@@ -10,7 +10,7 @@
import base
import pickle
from galaxy import util
-from galaxy.util.json import to_json_string
+from galaxy.util.json import to_json_string, from_json_string
pkg_resources.require( "simplejson" )
import simplejson
@@ -657,10 +657,16 @@
dbnames = list()
datasets = self.sa_session.query( self.app.model.HistoryDatasetAssociation ) \
.filter_by( deleted=False, history_id=self.history.id, extension="len" )
- if datasets.count() > 0:
- dbnames.append( (util.dbnames.default_value, '--------- User Defined Builds ----------') )
+
for dataset in datasets:
dbnames.append( (dataset.dbkey, dataset.name) )
+
+ user = self.get_user()
+ if user and 'dbkeys' in user.preferences:
+ user_keys = from_json_string( user.preferences['dbkeys'] )
+ for key, chrom_dict in user_keys.iteritems():
+ dbnames.append((key, "%s (%s) [Custom]" % (chrom_dict['name'], key) ))
+
dbnames.extend( util.dbnames )
return dbnames
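The hunk above now assembles the dbkey list from three sources: 'len' datasets in the current history, the user's custom builds stored as a JSON mapping under user.preferences['dbkeys'], and the stock util.dbnames list. A standalone sketch of that merge; it uses the standard-library json module in place of Galaxy's from_json_string, and plain (dbkey, name) pairs in place of the SQLAlchemy dataset query:

import json

def collect_db_builds( len_datasets, user_preferences, stock_dbnames ):
    # len_datasets: iterable of ( dbkey, name ) pairs from 'len' datasets.
    # user_preferences: dict like user.preferences; custom builds live
    #   under 'dbkeys' as a JSON-encoded { key: { 'name': ... } } mapping.
    # stock_dbnames: the built-in ( dbkey, name ) list, appended last.
    dbnames = []
    for dbkey, name in len_datasets:
        dbnames.append( ( dbkey, name ) )
    if user_preferences and 'dbkeys' in user_preferences:
        user_keys = json.loads( user_preferences['dbkeys'] )
        for key, chrom_dict in user_keys.items():
            dbnames.append( ( key, "%s (%s) [Custom]" % ( chrom_dict['name'], key ) ) )
    dbnames.extend( stock_dbnames )
    return dbnames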
diff -r 3c6ffa5362d2 -r 91b8f0abffc8 static/scripts/galaxy.base.js
--- a/static/scripts/galaxy.base.js Mon Apr 19 14:09:07 2010 -0400
+++ b/static/scripts/galaxy.base.js Mon Apr 19 16:03:24 2010 -0400
@@ -144,13 +144,13 @@
return 0;
}
-// Replace any select box with 20+ options with a text input box + autocomplete.
+// Replace select box with a text input box + autocomplete.
// TODO: make work with dynamic tool inputs and then can replace all big selects.
-function replace_big_select_inputs() {
+function replace_big_select_inputs(min_length) {
$('select[name=dbkey]').each( function() {
var select_elt = $(this);
- // Skip if there are < 20 options.
- if (select_elt.find('option').length < 20)
+ // Skip if # of options < threshold
+ if (min_length !== undefined && select_elt.find('option').length < min_length)
return;
// Replace select with text + autocomplete.
diff -r 3c6ffa5362d2 -r 91b8f0abffc8 static/scripts/packed/galaxy.base.js
--- a/static/scripts/packed/galaxy.base.js Mon Apr 19 14:09:07 2010 -0400
+++ b/static/scripts/packed/galaxy.base.js Mon Apr 19 16:03:24 2010 -0400
@@ -1,1 +1,1 @@
-$(document).ready(function(){replace_big_select_inputs()});$.fn.makeAbsolute=function(a){return this.each(function(){var b=$(this);var c=b.position();b.css({position:"absolute",marginLeft:0,marginTop:0,top:c.top,left:c.left,right:$(window).width()-(c.left+b.width())});if(a){b.remove().appendTo("body")}})};function ensure_popup_helper(){if($("#popup-helper").length===0){$("<div id='popup-helper'/>").css({background:"white",opacity:0,zIndex:15000,position:"absolute",top:0,left:0,width:"100%",height:"100%"}).appendTo("body").hide()}}function attach_popupmenu(b,d){var a=function(){d.unbind().hide();$("#popup-helper").unbind("click.popupmenu").hide()};var c=function(g){$("#popup-helper").bind("click.popupmenu",a).show();d.click(a).css({left:0,top:-1000}).show();var f=g.pageX-d.width()/2;f=Math.min(f,$(document).scrollLeft()+$(window).width()-$(d).width()-20);f=Math.max(f,$(document).scrollLeft()+20);d.css({top:g.pageY-5,left:f});return false};$(b).click(c)}function make_popupmen!
u(c,b){ensure_popup_helper();var a=$("<ul id='"+c.attr("id")+"-menu'></ul>");$.each(b,function(f,e){if(e){$("<li/>").html(f).click(e).appendTo(a)}else{$("<li class='head'/>").html(f).appendTo(a)}});var d=$("<div class='popmenu-wrapper'>");d.append(a).append("<div class='overlay-border'>").css("position","absolute").appendTo("body").hide();attach_popupmenu(c,d)}function make_popup_menus(){jQuery("div[popupmenu]").each(function(){var c={};$(this).find("a").each(function(){var b=$(this).attr("confirm"),d=$(this).attr("href"),e=$(this).attr("target");c[$(this).text()]=function(){if(!b||confirm(b)){var g=window;if(e=="_parent"){g=window.parent}else{if(e=="_top"){g=window.top}}g.location=d}}});var a=$("#"+$(this).attr("popupmenu"));make_popupmenu(a,c);$(this).remove();a.addClass("popup").show()})}function array_length(b){if(b.length){return b.length}var c=0;for(var a in b){c++}return c}function naturalSort(i,g){var n=/(-?[0-9\.]+)/g,j=i.toString().toLowerCase()||"",f=g.toString()!
.toLowerCase()||"",k=String.fromCharCode(0),l=j.replace(n,k+"$1"+k).sp
lit(k),e=f.replace(n,k+"$1"+k).split(k),d=(new Date(j)).getTime(),m=d?(new Date(f)).getTime():null;if(m){if(d<m){return -1}else{if(d>m){return 1}}}for(var h=0,c=Math.max(l.length,e.length);h<c;h++){oFxNcL=parseFloat(l[h])||l[h];oFyNcL=parseFloat(e[h])||e[h];if(oFxNcL<oFyNcL){return -1}else{if(oFxNcL>oFyNcL){return 1}}}return 0}function replace_big_select_inputs(){$("select[name=dbkey]").each(function(){var a=$(this);if(a.find("option").length<20){return}var b=a.attr("value");var c=$("<input type='text' class='text-and-autocomplete-select'></input>");c.attr("size",40);c.attr("name",a.attr("name"));c.attr("id",a.attr("id"));c.click(function(){var h=$(this).attr("value");$(this).attr("value","Loading...");$(this).showAllInCache();$(this).attr("value",h);$(this).select()});var g=[];var f={};a.children("option").each(function(){var i=$(this).text();var h=$(this).attr("value");if(h=="?"){return}g.push(i);f[i]=h;f[h]=h;if(h==b){c.attr("value",i)}});g.push("unspecified (?)");f["unsp!
ecified (?)"]="?";f["?"]="?";if(c.attr("value")==""){c.attr("value","Click to Search or Select")}g=g.sort(naturalSort);var e={selectFirst:false,autoFill:false,mustMatch:false,matchContains:true,max:1000,minChars:0,hideForLessThanMinChars:false};c.autocomplete(g,e);a.replaceWith(c);var d=function(){var i=c.attr("value");var h=f[i];if(h!==null&&h!==undefined){c.attr("value",h)}else{if(b!=""){c.attr("value",b)}else{c.attr("value","?")}}};c.parents("form").submit(function(){d()});$(document).bind("convert_dbkeys",function(){d()})})}function async_save_text(d,f,e,a,c,h,i,g,b){if(c===undefined){c=30}if(i===undefined){i=4}$("#"+d).live("click",function(){if($("#renaming-active").length>0){return}var l=$("#"+f),k=l.text(),j;if(h){j=$("<textarea></textarea>").attr({rows:i,cols:c}).text(k)}else{j=$("<input type='text'></input>").attr({value:k,size:c})}j.attr("id","renaming-active");j.blur(function(){$(this).remove();l.show();if(b){b(j)}});j.keyup(function(n){if(n.keyCode===27){$(this!
).trigger("blur")}else{if(n.keyCode===13){var m={};m[a]=$(this).val();
$(this).trigger("blur");$.ajax({url:e,data:m,error:function(){alert("Text editing for elt "+f+" failed")},success:function(o){l.text(o);if(b){b(j)}}})}}});if(g){g(j)}l.hide();j.insertAfter(l);j.focus();j.select();return})}function init_history_items(d,a,c){var b=function(){try{var e=$.jStore.store("history_expand_state");if(e){for(var g in e){$("#"+g+" div.historyItemBody").show()}}}catch(f){$.jStore.remove("history_expand_state")}if($.browser.mozilla){$("div.historyItemBody").each(function(){if(!$(this).is(":visible")){$(this).find("pre.peek").css("overflow","hidden")}})}d.each(function(){var j=this.id;var h=$(this).children("div.historyItemBody");var i=h.find("pre.peek");$(this).find(".historyItemTitleBar > .historyItemTitle").wrap("<a href='javascript:void();'></a>").click(function(){if(h.is(":visible")){if($.browser.mozilla){i.css("overflow","hidden")}h.slideUp("fast");if(!c){var k=$.jStore.store("history_expand_state");if(k){delete k[j];$.jStore.store("history_expand_st!
ate",k)}}}else{h.slideDown("fast",function(){if($.browser.mozilla){i.css("overflow","auto")}});if(!c){var k=$.jStore.store("history_expand_state");if(k===undefined){k={}}k[j]=true;$.jStore.store("history_expand_state",k)}}return false})});$("#top-links > a.toggle").click(function(){var h=$.jStore.store("history_expand_state");if(h===undefined){h={}}$("div.historyItemBody:visible").each(function(){if($.browser.mozilla){$(this).find("pre.peek").css("overflow","hidden")}$(this).slideUp("fast");if(h){delete h[$(this).parent().attr("id")]}});$.jStore.store("history_expand_state",h)}).show()};if(a){b()}else{$.jStore.init("galaxy");$.jStore.engineReady(function(){b()})}}$(document).ready(function(){$("a[confirm]").click(function(){return confirm($(this).attr("confirm"))});if($.fn.tipsy){$(".tooltip").tipsy({gravity:"s"})}make_popup_menus()});
\ No newline at end of file
+$(document).ready(function(){replace_big_select_inputs()});$.fn.makeAbsolute=function(a){return this.each(function(){var b=$(this);var c=b.position();b.css({position:"absolute",marginLeft:0,marginTop:0,top:c.top,left:c.left,right:$(window).width()-(c.left+b.width())});if(a){b.remove().appendTo("body")}})};function ensure_popup_helper(){if($("#popup-helper").length===0){$("<div id='popup-helper'/>").css({background:"white",opacity:0,zIndex:15000,position:"absolute",top:0,left:0,width:"100%",height:"100%"}).appendTo("body").hide()}}function attach_popupmenu(b,d){var a=function(){d.unbind().hide();$("#popup-helper").unbind("click.popupmenu").hide()};var c=function(g){$("#popup-helper").bind("click.popupmenu",a).show();d.click(a).css({left:0,top:-1000}).show();var f=g.pageX-d.width()/2;f=Math.min(f,$(document).scrollLeft()+$(window).width()-$(d).width()-20);f=Math.max(f,$(document).scrollLeft()+20);d.css({top:g.pageY-5,left:f});return false};$(b).click(c)}function make_popupmen!
u(c,b){ensure_popup_helper();var a=$("<ul id='"+c.attr("id")+"-menu'></ul>");$.each(b,function(f,e){if(e){$("<li/>").html(f).click(e).appendTo(a)}else{$("<li class='head'/>").html(f).appendTo(a)}});var d=$("<div class='popmenu-wrapper'>");d.append(a).append("<div class='overlay-border'>").css("position","absolute").appendTo("body").hide();attach_popupmenu(c,d)}function make_popup_menus(){jQuery("div[popupmenu]").each(function(){var c={};$(this).find("a").each(function(){var b=$(this).attr("confirm"),d=$(this).attr("href"),e=$(this).attr("target");c[$(this).text()]=function(){if(!b||confirm(b)){var g=window;if(e=="_parent"){g=window.parent}else{if(e=="_top"){g=window.top}}g.location=d}}});var a=$("#"+$(this).attr("popupmenu"));make_popupmenu(a,c);$(this).remove();a.addClass("popup").show()})}function array_length(b){if(b.length){return b.length}var c=0;for(var a in b){c++}return c}function naturalSort(i,g){var n=/(-?[0-9\.]+)/g,j=i.toString().toLowerCase()||"",f=g.toString()!
.toLowerCase()||"",k=String.fromCharCode(0),l=j.replace(n,k+"$1"+k).sp
lit(k),e=f.replace(n,k+"$1"+k).split(k),d=(new Date(j)).getTime(),m=d?(new Date(f)).getTime():null;if(m){if(d<m){return -1}else{if(d>m){return 1}}}for(var h=0,c=Math.max(l.length,e.length);h<c;h++){oFxNcL=parseFloat(l[h])||l[h];oFyNcL=parseFloat(e[h])||e[h];if(oFxNcL<oFyNcL){return -1}else{if(oFxNcL>oFyNcL){return 1}}}return 0}function replace_big_select_inputs(a){$("select[name=dbkey]").each(function(){var b=$(this);if(a!==undefined&&b.find("option").length<a){return}var c=b.attr("value");var d=$("<input type='text' class='text-and-autocomplete-select'></input>");d.attr("size",40);d.attr("name",b.attr("name"));d.attr("id",b.attr("id"));d.click(function(){var i=$(this).attr("value");$(this).attr("value","Loading...");$(this).showAllInCache();$(this).attr("value",i);$(this).select()});var h=[];var g={};b.children("option").each(function(){var j=$(this).text();var i=$(this).attr("value");if(i=="?"){return}h.push(j);g[j]=i;g[i]=i;if(i==c){d.attr("value",j)}});h.push("unspecifie!
d (?)");g["unspecified (?)"]="?";g["?"]="?";if(d.attr("value")==""){d.attr("value","Click to Search or Select")}h=h.sort(naturalSort);var f={selectFirst:false,autoFill:false,mustMatch:false,matchContains:true,max:1000,minChars:0,hideForLessThanMinChars:false};d.autocomplete(h,f);b.replaceWith(d);var e=function(){var j=d.attr("value");var i=g[j];if(i!==null&&i!==undefined){d.attr("value",i)}else{if(c!=""){d.attr("value",c)}else{d.attr("value","?")}}};d.parents("form").submit(function(){e()});$(document).bind("convert_dbkeys",function(){e()})})}function async_save_text(d,f,e,a,c,h,i,g,b){if(c===undefined){c=30}if(i===undefined){i=4}$("#"+d).live("click",function(){if($("#renaming-active").length>0){return}var l=$("#"+f),k=l.text(),j;if(h){j=$("<textarea></textarea>").attr({rows:i,cols:c}).text(k)}else{j=$("<input type='text'></input>").attr({value:k,size:c})}j.attr("id","renaming-active");j.blur(function(){$(this).remove();l.show();if(b){b(j)}});j.keyup(function(n){if(n.keyCo!
de===27){$(this).trigger("blur")}else{if(n.keyCode===13){var m={};m[a]
=$(this).val();$(this).trigger("blur");$.ajax({url:e,data:m,error:function(){alert("Text editing for elt "+f+" failed")},success:function(o){l.text(o);if(b){b(j)}}})}}});if(g){g(j)}l.hide();j.insertAfter(l);j.focus();j.select();return})}function init_history_items(d,a,c){var b=function(){try{var e=$.jStore.store("history_expand_state");if(e){for(var g in e){$("#"+g+" div.historyItemBody").show()}}}catch(f){$.jStore.remove("history_expand_state")}if($.browser.mozilla){$("div.historyItemBody").each(function(){if(!$(this).is(":visible")){$(this).find("pre.peek").css("overflow","hidden")}})}d.each(function(){var j=this.id;var h=$(this).children("div.historyItemBody");var i=h.find("pre.peek");$(this).find(".historyItemTitleBar > .historyItemTitle").wrap("<a href='javascript:void();'></a>").click(function(){if(h.is(":visible")){if($.browser.mozilla){i.css("overflow","hidden")}h.slideUp("fast");if(!c){var k=$.jStore.store("history_expand_state");if(k){delete k[j];$.jStore.store("hi!
story_expand_state",k)}}}else{h.slideDown("fast",function(){if($.browser.mozilla){i.css("overflow","auto")}});if(!c){var k=$.jStore.store("history_expand_state");if(k===undefined){k={}}k[j]=true;$.jStore.store("history_expand_state",k)}}return false})});$("#top-links > a.toggle").click(function(){var h=$.jStore.store("history_expand_state");if(h===undefined){h={}}$("div.historyItemBody:visible").each(function(){if($.browser.mozilla){$(this).find("pre.peek").css("overflow","hidden")}$(this).slideUp("fast");if(h){delete h[$(this).parent().attr("id")]}});$.jStore.store("history_expand_state",h)}).show()};if(a){b()}else{$.jStore.init("galaxy");$.jStore.engineReady(function(){b()})}}$(document).ready(function(){$("a[confirm]").click(function(){return confirm($(this).attr("confirm"))});if($.fn.tipsy){$(".tooltip").tipsy({gravity:"s"})}make_popup_menus()});
\ No newline at end of file
details: http://www.bx.psu.edu/hg/galaxy/rev/3c6ffa5362d2
changeset: 3667:3c6ffa5362d2
user: Kanwei Li <kanwei(a)gmail.com>
date: Mon Apr 19 14:09:07 2010 -0400
description:
Fix autocomplete for Edit Attributes dbkeys
diffstat:
templates/dataset/edit_attributes.mako | 4 ++--
1 files changed, 2 insertions(+), 2 deletions(-)
diffs (18 lines):
diff -r af004e0932b7 -r 3c6ffa5362d2 templates/dataset/edit_attributes.mako
--- a/templates/dataset/edit_attributes.mako Mon Apr 19 11:40:20 2010 -0400
+++ b/templates/dataset/edit_attributes.mako Mon Apr 19 14:09:07 2010 -0400
@@ -4,12 +4,12 @@
<%def name="title()">${_('Edit Dataset Attributes')}</%def>
<%def name="stylesheets()">
- ${h.css( "base" )}
+ ${h.css( "base", "autocomplete_tagging" )}
</%def>
<%def name="javascripts()">
${parent.javascripts()}
- ${h.js( "galaxy.base" )}
+ ${h.js( "galaxy.base", "jquery.autocomplete", "autocomplete_tagging" )}
</%def>
<%def name="datatype( dataset, datatypes )">
details: http://www.bx.psu.edu/hg/galaxy/rev/af004e0932b7
changeset: 3666:af004e0932b7
user: Nate Coraor <nate(a)bx.psu.edu>
date: Mon Apr 19 11:40:20 2010 -0400
description:
Add column join tool to main tool conf
diffstat:
tool_conf.xml.main | 1 +
1 files changed, 1 insertions(+), 0 deletions(-)
diffs (11 lines):
diff -r 239fb5cf4e37 -r af004e0932b7 tool_conf.xml.main
--- a/tool_conf.xml.main Mon Apr 19 11:14:19 2010 -0400
+++ b/tool_conf.xml.main Mon Apr 19 11:40:20 2010 -0400
@@ -70,6 +70,7 @@
<tool file="filters/compare.xml"/>
<tool file="new_operations/subtract_query.xml"/>
<tool file="stats/grouping.xml" />
+ <tool file="new_operations/column_join.xml"/>
</section>
<section name="Extract Features" id="features">
<tool file="filters/ucsc_gene_bed_to_exon_bed.xml" />
details: http://www.bx.psu.edu/hg/galaxy/rev/d3ff52561d78
changeset: 3664:d3ff52561d78
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Mon Apr 19 11:08:25 2010 -0400
description:
Complete Cufflinks wrapper.
diffstat:
tools/ngs_rna/cufflinks_wrapper.py | 64 +++++++++++++++++++-------
tools/ngs_rna/cufflinks_wrapper.xml | 88 ++++++++++++++++++++++++++++++++++--
2 files changed, 128 insertions(+), 24 deletions(-)
diffs (244 lines):
diff -r efd404f7a60b -r d3ff52561d78 tools/ngs_rna/cufflinks_wrapper.py
--- a/tools/ngs_rna/cufflinks_wrapper.py Fri Apr 16 18:46:35 2010 -0400
+++ b/tools/ngs_rna/cufflinks_wrapper.py Mon Apr 19 11:08:25 2010 -0400
@@ -10,20 +10,20 @@
#Parse Command Line
parser = optparse.OptionParser()
 parser.add_option( '-1', '--input', dest='input', help=' file of RNA-Seq read alignments in the SAM format. SAM is a standard short read alignment format that allows aligners to attach custom tags to individual alignments, and Cufflinks requires that the alignments you supply have some of these tags. Please see Input formats for more details.' )
- parser.add_option( '-s', '--inner-dist-std-dev', help='The standard deviation for the distribution on inner distances between mate pairs. The default is 20bp.' )
- parser.add_option( '-I', '--max-intron-length', help='The minimum intron length. Cufflinks will not report transcripts with introns longer than this, and will ignore SAM alignments with REF_SKIP CIGAR operations longer than this. The default is 300,000.' )
- parser.add_option( '-F', '--min-isoform-fraction', help='After calculating isoform abundance for a gene, Cufflinks filters out transcripts that it believes are very low abundance, because isoforms expressed at extremely low levels often cannot reliably be assembled, and may even be artifacts of incompletely spliced precursors of processed transcripts. This parameter is also used to filter out introns that have far fewer spliced alignments supporting them. The default is 0.05, or 5% of the most abundant isoform (the major isoform) of the gene.' )
- parser.add_option( '-j', '--pre-mrna-fraction', help='Some RNA-Seq protocols produce a significant amount of reads that originate from incompletely spliced transcripts, and these reads can confound the assembly of fully spliced mRNAs. Cufflinks uses this parameter to filter out alignments that lie within the intronic intervals implied by the spliced alignments. The minimum depth of coverage in the intronic region covered by the alignment is divided by the number of spliced reads, and if the result is lower than this parameter value, the intronic alignments are ignored. The default is 5%.' )
- parser.add_option( '-p', '--num-threads', help='Use this many threads to align reads. The default is 1.' )
+ parser.add_option( '-s', '--inner-dist-std-dev', dest='inner_dist_std_dev', help='The standard deviation for the distribution on inner distances between mate pairs. The default is 20bp.' )
+ parser.add_option( '-I', '--max-intron-length', dest='max_intron_len', help='The minimum intron length. Cufflinks will not report transcripts with introns longer than this, and will ignore SAM alignments with REF_SKIP CIGAR operations longer than this. The default is 300,000.' )
+ parser.add_option( '-F', '--min-isoform-fraction', dest='min_isoform_fraction', help='After calculating isoform abundance for a gene, Cufflinks filters out transcripts that it believes are very low abundance, because isoforms expressed at extremely low levels often cannot reliably be assembled, and may even be artifacts of incompletely spliced precursors of processed transcripts. This parameter is also used to filter out introns that have far fewer spliced alignments supporting them. The default is 0.05, or 5% of the most abundant isoform (the major isoform) of the gene.' )
+ parser.add_option( '-j', '--pre-mrna-fraction', dest='pre_mrna_fraction', help='Some RNA-Seq protocols produce a significant amount of reads that originate from incompletely spliced transcripts, and these reads can confound the assembly of fully spliced mRNAs. Cufflinks uses this parameter to filter out alignments that lie within the intronic intervals implied by the spliced alignments. The minimum depth of coverage in the intronic region covered by the alignment is divided by the number of spliced reads, and if the result is lower than this parameter value, the intronic alignments are ignored. The default is 5%.' )
+ parser.add_option( '-p', '--num-threads', dest='num_threads', help='Use this many threads to align reads. The default is 1.' )
parser.add_option( '-m', '--inner-mean-dist', dest='inner_mean_dist', help='This is the expected (mean) inner distance between mate pairs. \
 For example, for paired end runs with fragments selected at 300bp, \
where each end is 50bp, you should set -r to be 200. The default is 45bp.')
- parser.add_option( '-Q', '--min-mapqual', help='Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number. The default is 0.' )
- parser.add_option( '-L', '--label', help='Cufflinks will report transfrags in GTF format, with a prefix given by this option. The default prefix is "CUFF".' )
- parser.add_option( '-G', '--GTF', help='Tells Cufflinks to use the supplied reference annotation to estimate isoform expression. It will not assemble novel transcripts, and the program will ignore alignments not structurally compatible with any reference transcript.' )
+ parser.add_option( '-Q', '--min-mapqual', dest='min_mapqual', help='Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number. The default is 0.' )
+ parser.add_option( '-G', '--GTF', dest='GTF', help='Tells Cufflinks to use the supplied reference annotation to estimate isoform expression. It will not assemble novel transcripts, and the program will ignore alignments not structurally compatible with any reference transcript.' )
+
# Advanced Options:
- parser.add_option( '--num-importance-samples', help='Sets the number of importance samples generated for each locus during abundance estimation. Default: 1000' )
- parser.add_option( '--max-mle-iterations', help='Sets the number of iterations allowed during maximum likelihood estimation of abundances. Default: 5000' )
+ parser.add_option( '--num-importance-samples', dest='num_importance_samples', help='Sets the number of importance samples generated for each locus during abundance estimation. Default: 1000' )
+ parser.add_option( '--max-mle-iterations', dest='max_mle_iterations', help='Sets the number of iterations allowed during maximum likelihood estimation of abundances. Default: 5000' )
# Wrapper / Galaxy options.
 parser.add_option( '-A', '--assembled-isoforms-output', dest='assembled_isoforms_output_file', help='Assembled isoforms output file; format is GTF.' )
@@ -41,31 +41,61 @@
cmd = "cufflinks"
# Add options.
+ if options.inner_dist_std_dev:
+ cmd += ( " -s %i" % int ( options.inner_dist_std_dev ) )
+ if options.max_intron_len:
+ cmd += ( " -I %i" % int ( options.max_intron_len ) )
+ if options.min_isoform_fraction:
+ cmd += ( " -F %f" % float ( options.min_isoform_fraction ) )
+ if options.pre_mrna_fraction:
+ cmd += ( " -j %f" % float ( options.pre_mrna_fraction ) )
+ if options.num_threads:
+ cmd += ( " -p %i" % int ( options.num_threads ) )
if options.inner_mean_dist:
cmd += ( " -m %i" % int ( options.inner_mean_dist ) )
+ if options.min_mapqual:
+ cmd += ( " -Q %i" % int ( options.min_mapqual ) )
+ if options.GTF:
+ cmd += ( " -G %i" % options.GTF )
+ if options.num_importance_samples:
+ cmd += ( " --num-importance-samples %i" % int ( options.num_importance_samples ) )
+ if options.max_mle_iterations:
+ cmd += ( " --max-mle-iterations %i" % int ( options.max_mle_iterations ) )
# Add input files.
cmd += " " + options.input
-
- # Run
+ print cmd
+
+ # Run command.
try:
- proc = subprocess.Popen( args=cmd, shell=True, cwd=tmp_output_dir, stdout=subprocess.PIPE, stderr=subprocess.PIPE )
+ tmp_name = tempfile.NamedTemporaryFile( dir=tmp_output_dir ).name
+ tmp_stderr = open( tmp_name, 'wb' )
+ proc = subprocess.Popen( args=cmd, shell=True, cwd=tmp_output_dir, stderr=tmp_stderr.fileno() )
returncode = proc.wait()
+ tmp_stderr.close()
+
+ # Get stderr, allowing for case where it's very large.
+ tmp_stderr = open( tmp_name, 'rb' )
stderr = ''
buffsize = 1048576
try:
while True:
- stderr += proc.stderr.read( buffsize )
+ stderr += tmp_stderr.read( buffsize )
if not stderr or len( stderr ) % buffsize != 0:
break
except OverflowError:
pass
+ tmp_stderr.close()
+
+ # Error checking.
if returncode != 0:
raise Exception, stderr
+
+ # check that there are results in the output file
+ if len( open( tmp_output_dir + "/transcripts.gtf", 'rb' ).read().strip() ) == 0:
+ raise Exception, 'The main output file is empty, there may be an error with your input file or settings.'
except Exception, e:
- stop_err( 'Error in cufflinks:\n' + str( e ) )
-
- # TODO: look for errors in program output.
+ stop_err( 'Error running cufflinks. ' + str( e ) )
# Copy output files from tmp directory to specified files.
try:
diff -r efd404f7a60b -r d3ff52561d78 tools/ngs_rna/cufflinks_wrapper.xml
--- a/tools/ngs_rna/cufflinks_wrapper.xml Fri Apr 16 18:46:35 2010 -0400
+++ b/tools/ngs_rna/cufflinks_wrapper.xml Mon Apr 19 11:08:25 2010 -0400
@@ -1,5 +1,5 @@
<tool id="cufflinks" name="Cufflinks" version="0.8.2">
- <description>Transcript assembly, differential expression, and differential regulation for RNA-Seq</description>
+ <description>transcript assembly, differential expression, and differential regulation for RNA-Seq</description>
<command interpreter="python">
cufflinks_wrapper.py
--input=$input
@@ -7,24 +7,46 @@
--transcripts-expression-output=$transcripts_expression
--genes-expression-output=$genes_expression
--num-threads="4"
+ -I $max_intron_len
+ -F $min_isoform_fraction
+ -j $pre_mrna_fraction
+ -Q $min_map_quality
+ #if $reference_annotation.use_ref == "Yes":
+ -G $reference_annotation.reference_annotation_file
+ #end if
#if $singlePaired.sPaired == "paired":
-r $singlePaired.mean_inner_distance
+ -s $singlePaired.inner_distance_std_dev
#end if
</command>
<inputs>
<param format="sam" name="input" type="data" label="SAM file of aligned RNA-Seq reads" help=""/>
+ <param name="max_intron_len" type="integer" value="300000" label="Max Intron Length" help=""/>
+ <param name="min_isoform_fraction" type="float" value="0.05" label="Min Isoform Fraction" help=""/>
+ <param name="pre_mrna_fraction" type="float" value="0.05" label="Pre MRNA Fraction" help=""/>
+ <param name="min_map_quality" type="integer" value="0" label="Min SAM Map Quality" help=""/>
+ <conditional name="reference_annotation">
+ <param name="use_ref" type="select" label="Use Reference Annotation?">
+ <option value="No">No</option>
+ <option value="Yes">Yes</option>
+ </param>
+ <when value="No"></when>
+ <when value="Yes">
+ <param format="gtf" name="reference_annotation_file" type="data" label="Reference Annotation" help=""/>
+ </when>
+ </conditional>
<conditional name="singlePaired">
<param name="sPaired" type="select" label="Is this library mate-paired?">
<option value="single">Single-end</option>
<option value="paired">Paired-end</option>
</param>
- <when value="single">
-
- </when>
+ <when value="single"></when>
<when value="paired">
<param name="mean_inner_distance" type="integer" value="20" label="Mean Inner Distance between Mate Pairs"/>
+ <param name="inner_distance_std_dev" type="integer" value="20" label="Standard Deviation for Inner Distance between Mate Pairs"/>
</when>
</conditional>
+
</inputs>
<outputs>
@@ -67,19 +89,64 @@
**Input formats**
-Cufflinks accepts files in SAM format.
+Cufflinks takes a text file of SAM alignments as input. The RNA-Seq read mapper TopHat produces output in this format, and is recommended for use with Cufflinks. However, Cufflinks will accept SAM alignments generated by any read mapper. Here's an example of an alignment Cufflinks will accept::
+
+ s6.25mer.txt-913508 16 chr1 4482736 255 14M431N11M * 0 0 \
+ CAAGATGCTAGGCAAGTCTTGGAAG IIIIIIIIIIIIIIIIIIIIIIIII NM:i:0 XS:A:-
+
+Note the use of the custom tag XS. This attribute, which must have a value of "+" or "-", indicates which strand the RNA that produced this read came from. While this tag can be applied to any alignment, including unspliced ones, it must be present for all spliced alignment records (those with a 'N' operation in the CIGAR string).
+The SAM file supplied to Cufflinks must be sorted by reference position. If you aligned your reads with TopHat, your alignments will be properly sorted already. If you used another tool, you may want to make sure they are properly sorted as follows::
+
+ sort -k 3,3 -k 4,4n hits.sam > hits.sam.sorted
+
+NOTE: Cufflinks currently only supports SAM alignments with the CIGAR match ('M') and reference skip ('N') operations. Support for the other operations, such as insertions, deletions, and clipping, will be added in the future.
------
**Outputs**
-TODO
+Cufflinks produces three output files:
+
+Transcripts and Genes:
+
+This GTF file contains Cufflinks' assembled isoforms. The first 7 columns are standard GTF, and the last column contains attributes, some of which are also standardized (e.g. gene_id, transcript_id). There is one GTF record per row, and each record represents either a transcript or an exon within a transcript. The columns are defined as follows::
+
+ Column number Column name Example Description
+ -----------------------------------------------------
+ 1 seqname chrX Chromosome or contig name
+ 2 source Cufflinks The name of the program that generated this file (always 'Cufflinks')
+ 3 feature exon The type of record (always either "transcript" or "exon").
+ 4 start 77696957 The leftmost coordinate of this record (where 0 is the leftmost possible coordinate)
+ 5 end 77712009 The rightmost coordinate of this record, inclusive.
+ 6 score 1000 The most abundant isoform for each gene is assigned a score of 1000. Minor isoforms are scored by the ratio (minor FPKM/major FPKM)
+ 7 strand + Cufflinks' guess for which strand the isoform came from. Always one of '+', '-', '.'
+ 8 frame . Cufflinks does not predict where the start and stop codons (if any) are located within each transcript, so this field is not used.
+ 9 attributes See below
+
+Each GTF record is decorated with the following attributes::
+
+ Attribute Example Description
+ -----------------------------------------
+ gene_id CUFF.1 Cufflinks gene id
+ transcript_id CUFF.1.1 Cufflinks transcript id
+ FPKM 101.267 Isoform-level relative abundance in Reads Per Kilobase of exon model per Million mapped reads
+ frac 0.7647 Reserved. Please ignore, as this attribute may be deprecated in the future
+ conf_lo 0.07 Lower bound of the 95% confidence interval of the abundance of this isoform, as a fraction of the isoform abundance. That is, lower bound = FPKM * (1.0 - conf_lo)
+ conf_hi 0.1102 Upper bound of the 95% confidence interval of the abundance of this isoform, as a fraction of the isoform abundance. That is, upper bound = FPKM * (1.0 + conf_hi)
+ cov 100.765 Estimate for the absolute depth of read coverage across the whole transcript
+
+
+Transcripts only:
+This file is simply a tab-delimited file containing one row per transcript, with columns containing the attributes above. There are a few additional attributes not in the table above, but these are reserved for debugging, and may change or disappear in the future.
+
+Genes only:
+This file contains gene-level coordinates and expression values.
-------
**Cufflinks settings**
-All of the options have a default value. You can change any of them. Some of the options in Cufflinks have been implemented here.
+All of the options have a default value. You can change any of them. Most of the options in Cufflinks have been implemented here.
------
@@ -87,5 +154,12 @@
This is a list of implemented Cufflinks options::
+ -m INT This is the expected (mean) inner distance between mate pairs. For example, for paired end runs with fragments selected at 300bp, where each end is 50bp, you should set -r to be 200. The default is 45bp.
+ -s INT The standard deviation for the distribution on inner distances between mate pairs. The default is 20bp.
+ -I INT The minimum intron length. Cufflinks will not report transcripts with introns longer than this, and will ignore SAM alignments with REF_SKIP CIGAR operations longer than this. The default is 300,000.
+ -F After calculating isoform abundance for a gene, Cufflinks filters out transcripts that it believes are very low abundance, because isoforms expressed at extremely low levels often cannot reliably be assembled, and may even be artifacts of incompletely spliced precursors of processed transcripts. This parameter is also used to filter out introns that have far fewer spliced alignments supporting them. The default is 0.05, or 5% of the most abundant isoform (the major isoform) of the gene.
+ -j Some RNA-Seq protocols produce a significant amount of reads that originate from incompletely spliced transcripts, and these reads can confound the assembly of fully spliced mRNAs. Cufflinks uses this parameter to filter out alignments that lie within the intronic intervals implied by the spliced alignments. The minimum depth of coverage in the intronic region covered by the alignment is divided by the number of spliced reads, and if the result is lower than this parameter value, the intronic alignments are ignored. The default is 5%.
+ -Q Instructs Cufflinks to ignore alignments with a SAM mapping quality lower than this number. The default is 0.
+ -G Tells Cufflinks to use the supplied reference annotation to estimate isoform expression. It will not assemble novel transcripts, and the program will ignore alignments not structurally compatible with any reference transcript.
</help>
</tool>
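The wrapper above builds its cufflinks command line by casting each supplied optparse value before interpolating it, so a non-numeric value fails immediately at the Python level rather than producing a malformed cufflinks invocation. A condensed sketch of that option-to-flag pattern; the flag table is an illustrative subset and the helper name is hypothetical:

def build_command( options, base='cufflinks' ):
    # ( flag, option attribute, cast ); note -G is a file path, so it
    # is interpolated as a string rather than cast to a number.
    flags = [ ( '-I', 'max_intron_len', int ),
              ( '-F', 'min_isoform_fraction', float ),
              ( '-j', 'pre_mrna_fraction', float ),
              ( '-Q', 'min_mapqual', int ),
              ( '-G', 'GTF', str ) ]
    cmd = base
    for flag, attr, cast in flags:
        value = getattr( options, attr, None )
        if value:
            cmd += " %s %s" % ( flag, cast( value ) )
    return cmd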