galaxy-commits
March 2011 (1 participant, 141 discussions)
commit/galaxy-central: kellyv: Added two more builds to the manual builds list
by Bitbucket 10 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/0c5d60534ea8/
changeset: r5209:0c5d60534ea8
user: kellyv
date: 2011-03-10 22:08:39
summary: Added two more builds to the manual builds list
affected #: 1 file (233 bytes)
--- a/tool-data/shared/ucsc/manual_builds.txt Thu Mar 10 15:48:00 2011 -0500
+++ b/tool-data/shared/ucsc/manual_builds.txt Thu Mar 10 16:08:39 2011 -0500
@@ -696,3 +696,5 @@
Spur_v2.6 Purple Sea Urchin (Strongylocentrotus purpuratus) v2.6
Ptrichocarpa_156 Poplar (Populus trichocarpa)
Hydra_JCVI Hydra magnipapillata str. 105
+Araly1 Arabidopsis lyrata
+Zea_mays_B73_RefGen_v2 Maize (Zea mays) chr1=301354135,chr2=237068928,chr3=232140222,chr4=241473566,chr5=217872898,chr6=169174371,chr7=176764813,chr8=175793772,chr9=156750718,chr10=150189513,chr11=7140224
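The added manual_builds.txt lines above pair a build key with a display name and, optionally, a comma-separated list of chrom=length pairs. A minimal sketch of a parser for that layout, assuming the columns are tab-separated (the actual delimiter is not visible in the diff):

```python
# Hypothetical parser for entries shaped like the manual_builds.txt lines
# in the diff above. Assumes tab-separated columns: build key, description,
# and an optional comma-separated list of chrom=length pairs.
def parse_manual_build(line):
    parts = line.rstrip("\n").split("\t")
    key, description = parts[0], parts[1]
    chroms = {}
    if len(parts) > 2 and parts[2].strip():
        for pair in parts[2].split(","):
            name, _, length = pair.partition("=")
            chroms[name] = int(length)
    return key, description, chroms

# An entry without chrom lengths (like the Araly1 line) yields an empty dict.
key, desc, chroms = parse_manual_build("Araly1\tArabidopsis lyrata")
```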
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Add a checkbox to the Create Group page that if checked will create a new Role with the same name. This provides a similar feature to the existing checkbox on the Create Role page.
by Bitbucket 10 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/8c04153e4cf6/
changeset: r5208:8c04153e4cf6
user: greg
date: 2011-03-10 21:48:00
summary: Add a checkbox to the Create Group page that if checked will create a new Role with the same name. This provides a similar feature to the existing checkbox on the Create Role page.
However, the behavior is now changed such that new associations are created when the checkbox is checked whereas before, only the Group or Role objects with the same name were created, but not associated with anything.
This code was very old, so I did some cleanup / improvements as well. Functional tests enhanced to reflect changes.
affected #: 6 files (3.9 KB)
--- a/lib/galaxy/web/base/controller.py Thu Mar 10 15:20:40 2011 -0500
+++ b/lib/galaxy/web/base/controller.py Thu Mar 10 15:48:00 2011 -0500
@@ -1185,16 +1185,24 @@
webapp = params.get( 'webapp', 'galaxy' )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
+ name = util.restore_text( params.get( 'name', '' ) )
+ description = util.restore_text( params.get( 'description', '' ) )
+ in_users = util.listify( params.get( 'in_users', [] ) )
+ out_users = util.listify( params.get( 'out_users', [] ) )
+ in_groups = util.listify( params.get( 'in_groups', [] ) )
+ out_groups = util.listify( params.get( 'out_groups', [] ) )
+ create_group_for_role = params.get( 'create_group_for_role', '' )
+ create_group_for_role_checked = CheckboxField.is_checked( create_group_for_role )
+ ok = True
if params.get( 'create_role_button', False ):
- name = util.restore_text( params.name )
- description = util.restore_text( params.description )
- in_users = util.listify( params.get( 'in_users', [] ) )
- in_groups = util.listify( params.get( 'in_groups', [] ) )
- create_group_for_role = params.get( 'create_group_for_role', 'no' )
if not name or not description:
- message = "Enter a valid name and a description"
+ message = "Enter a valid name and a description."
+ status = 'error'
+ ok = False
elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==name ).first():
- message = "A role with that name already exists"
+ message = "Role names must be unique and a role with that name already exists, so choose another name."
+ status = 'error'
+ ok = False
else:
# Create the role
role = trans.app.model.Role( name=name, description=description, type=trans.app.model.Role.types.ADMIN )
@@ -1207,41 +1215,44 @@
for group in [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in in_groups ]:
gra = trans.app.model.GroupRoleAssociation( group, role )
trans.sa_session.add( gra )
- if create_group_for_role == 'yes':
+ if create_group_for_role_checked:
# Create the group
group = trans.app.model.Group( name=name )
trans.sa_session.add( group )
- message = "Group '%s' has been created, and role '%s' has been created with %d associated users and %d associated groups" % \
- ( group.name, role.name, len( in_users ), len( in_groups ) )
+ # Associate the group with the role
+ gra = trans.model.GroupRoleAssociation( group, role )
+ trans.sa_session.add( gra )
+ num_in_groups = len( in_groups ) + 1
else:
- message = "Role '%s' has been created with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
+ num_in_groups = len( in_groups )
trans.sa_session.flush()
+ message = "Role '%s' has been created with %d associated users and %d associated groups. " \
+ % ( role.name, len( in_users ), num_in_groups )
+ if create_group_for_role_checked:
+ message += 'One of the groups associated with this role is the newly created group with the same name.'
trans.response.send_redirect( web.url_for( controller='admin',
action='roles',
webapp=webapp,
message=util.sanitize_text( message ),
status='done' ) )
- trans.response.send_redirect( web.url_for( controller='admin',
- action='create_role',
- webapp=webapp,
- message=util.sanitize_text( message ),
- status='error' ) )
- out_users = []
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- out_users.append( ( user.id, user.email ) )
- out_groups = []
- for group in trans.sa_session.query( trans.app.model.Group ) \
- .filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ):
- out_groups.append( ( group.id, group.name ) )
+ if ok:
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ out_users.append( ( user.id, user.email ) )
+ for group in trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
+ out_groups.append( ( group.id, group.name ) )
return trans.fill_template( '/admin/dataset_security/role/role_create.mako',
- in_users=[],
+ webapp=webapp,
+ name=name,
+ description=description,
+ in_users=in_users,
out_users=out_users,
- in_groups=[],
+ in_groups=in_groups,
out_groups=out_groups,
- webapp=webapp,
+ create_group_for_role_checked=create_group_for_role_checked,
message=message,
status=status )
@web.expose
@@ -1617,14 +1628,23 @@
webapp = params.get( 'webapp', 'galaxy' )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
+ name = util.restore_text( params.get( 'name', '' ) )
+ in_users = util.listify( params.get( 'in_users', [] ) )
+ out_users = util.listify( params.get( 'out_users', [] ) )
+ in_roles = util.listify( params.get( 'in_roles', [] ) )
+ out_roles = util.listify( params.get( 'out_roles', [] ) )
+ create_role_for_group = params.get( 'create_role_for_group', '' )
+ create_role_for_group_checked = CheckboxField.is_checked( create_role_for_group )
+ ok = True
if params.get( 'create_group_button', False ):
- name = util.restore_text( params.name )
- in_users = util.listify( params.get( 'in_users', [] ) )
- in_roles = util.listify( params.get( 'in_roles', [] ) )
if not name:
- message = "Enter a valid name"
+ message = "Enter a valid name."
+ status = 'error'
+ ok = False
elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==name ).first():
- message = "A group with that name already exists"
+ message = "Group names must be unique and a group with that name already exists, so choose another name."
+ status = 'error'
+ ok = False
else:
# Create the group
group = trans.app.model.Group( name=name )
@@ -1634,39 +1654,49 @@
for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
uga = trans.app.model.UserGroupAssociation( user, group )
trans.sa_session.add( uga )
- trans.sa_session.flush()
# Create the GroupRoleAssociations
for role in [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in in_roles ]:
gra = trans.app.model.GroupRoleAssociation( group, role )
trans.sa_session.add( gra )
- trans.sa_session.flush()
- message = "Group '%s' has been created with %d associated users and %d associated roles" % ( name, len( in_users ), len( in_roles ) )
+ if create_role_for_group_checked:
+ # Create the role
+ role = trans.app.model.Role( name=name, description='Role for group %s' % name )
+ trans.sa_session.add( role )
+ # Associate the role with the group
+ gra = trans.model.GroupRoleAssociation( group, role )
+ trans.sa_session.add( gra )
+ num_in_roles = len( in_roles ) + 1
+ else:
+ num_in_roles = len( in_roles )
+ trans.sa_session.flush()
+ message = "Group '%s' has been created with %d associated users and %d associated roles. " \
+ % ( group.name, len( in_users ), num_in_roles )
+ if create_role_for_group_checked:
+ message += 'One of the roles associated with this group is the newly created role with the same name.'
trans.response.send_redirect( web.url_for( controller='admin',
action='groups',
webapp=webapp,
message=util.sanitize_text( message ),
status='done' ) )
- trans.response.send_redirect( web.url_for( controller='admin',
- action='create_group',
- webapp=webapp,
- message=util.sanitize_text( message ),
- status='error' ) )
- out_users = []
- for user in trans.sa_session.query( trans.app.model.User ) \
- .filter( trans.app.model.User.table.c.deleted==False ) \
- .order_by( trans.app.model.User.table.c.email ):
- out_users.append( ( user.id, user.email ) )
- out_roles = []
- for role in trans.sa_session.query( trans.app.model.Role ) \
- .filter( trans.app.model.Role.table.c.deleted==False ) \
- .order_by( trans.app.model.Role.table.c.name ):
- out_roles.append( ( role.id, role.name ) )
+
+
+ if ok:
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
+ out_users.append( ( user.id, user.email ) )
+ for role in trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
+ out_roles.append( ( role.id, role.name ) )
return trans.fill_template( '/admin/dataset_security/group/group_create.mako',
- in_users=[],
+ webapp=webapp,
+ name=name,
+ in_users=in_users,
out_users=out_users,
- in_roles=[],
+ in_roles=in_roles,
out_roles=out_roles,
- webapp=webapp,
+ create_role_for_group_checked=create_role_for_group_checked,
message=message,
status=status )
@web.expose
--- a/lib/galaxy/web/controllers/library_common.py Thu Mar 10 15:20:40 2011 -0500
+++ b/lib/galaxy/web/controllers/library_common.py Thu Mar 10 15:48:00 2011 -0500
@@ -1986,13 +1986,7 @@
# We've been called from a menu option for a library dataset search result set
move_ldda_ids = util.listify( item_id )
if move_ldda_ids:
- # Checkboxes cause 2 copies of each id to be included in the request
move_ldda_ids = map( trans.security.decode_id, move_ldda_ids )
- unique_ldda_ids = []
- for ldda_id in move_ldda_ids:
- if ldda_id not in unique_ldda_ids:
- unique_ldda_ids.append( ldda_id )
- move_ldda_ids = unique_ldda_ids
elif item_type == 'folder':
move_folder_id = item_id
move_folder = trans.sa_session.query( trans.model.LibraryFolder ).get( trans.security.decode_id( move_folder_id ) )
--- a/templates/admin/dataset_security/group/group_create.mako Thu Mar 10 15:20:40 2011 -0500
+++ b/templates/admin/dataset_security/group/group_create.mako Thu Mar 10 15:48:00 2011 -0500
@@ -42,7 +42,12 @@
});
});
</script>
-
+
+<%
+ from galaxy.web.form_builder import CheckboxField
+ create_role_for_group_checkbox = CheckboxField( 'create_role_for_group' )
+%>
+
%if message:
${render_msg( message, status )}
%endif
@@ -54,7 +59,7 @@
<div class="form-row"><input name="webapp" type="hidden" value="${webapp}" size=40"/><label>Name:</label>
- <input name="name" type="textfield" value="" size=40"/>
+ <input name="name" type="textfield" value="${name}" size=40"/></div><div class="form-row"><div style="float: left; margin-right: 10px;">
@@ -81,6 +86,12 @@
</div></div><div class="form-row">
+ %if create_role_for_group_checked:
+ <% create_role_for_group_checkbox.checked = True %>
+ %endif
+ ${create_role_for_group_checkbox.get_html()} Create a new role of the same name for this group
+ </div>
+ <div class="form-row"><input type="submit" name="create_group_button" value="Save"/></div></form>
--- a/templates/admin/dataset_security/role/role_create.mako Thu Mar 10 15:20:40 2011 -0500
+++ b/templates/admin/dataset_security/role/role_create.mako Thu Mar 10 15:48:00 2011 -0500
@@ -19,30 +19,35 @@
</%def><script type="text/javascript">
-$().ready(function() {
- $('#groups_add_button').click(function() {
- return !$('#out_groups option:selected').remove().appendTo('#in_groups');
- });
- $('#groups_remove_button').click(function() {
- return !$('#in_groups option:selected').remove().appendTo('#out_groups');
- });
- $('#users_add_button').click(function() {
- return !$('#out_users option:selected').remove().appendTo('#in_users');
- });
- $('#users_remove_button').click(function() {
- return !$('#in_users option:selected').remove().appendTo('#out_users');
- });
- $('form#associate_role_group_user').submit(function() {
- $('#in_groups option').each(function(i) {
- $(this).attr("selected", "selected");
+ $().ready(function() {
+ $('#groups_add_button').click(function() {
+ return !$('#out_groups option:selected').remove().appendTo('#in_groups');
});
- $('#in_users option').each(function(i) {
- $(this).attr("selected", "selected");
+ $('#groups_remove_button').click(function() {
+ return !$('#in_groups option:selected').remove().appendTo('#out_groups');
+ });
+ $('#users_add_button').click(function() {
+ return !$('#out_users option:selected').remove().appendTo('#in_users');
+ });
+ $('#users_remove_button').click(function() {
+ return !$('#in_users option:selected').remove().appendTo('#out_users');
+ });
+ $('form#associate_role_group_user').submit(function() {
+ $('#in_groups option').each(function(i) {
+ $(this).attr("selected", "selected");
+ });
+ $('#in_users option').each(function(i) {
+ $(this).attr("selected", "selected");
+ });
});
});
-});
</script>
+<%
+ from galaxy.web.form_builder import CheckboxField
+ create_group_for_role_checkbox = CheckboxField( 'create_group_for_role' )
+%>
+
%if message:
${render_msg( message, status )}
%endif
@@ -54,11 +59,11 @@
<div class="form-row"><input name="webapp" type="hidden" value="${webapp}" size=40"/><label>Name:</label>
- <input name="name" type="textfield" value="" size=40"/>
+ <input name="name" type="textfield" value="${name}" size=40"/></div><div class="form-row"><label>Description:</label>
- <input name="description" type="textfield" value="" size=40"/>
+ <input name="description" type="textfield" value="${description}" size=40"/></div><div class="form-row"><div style="float: left; margin-right: 10px;">
@@ -85,7 +90,10 @@
</div></div><div class="form-row">
- <input type="checkbox" name="create_group_for_role" value="yes" />Create a new group of the same name for this role
+ %if create_group_for_role_checked:
+ <% create_group_for_role_checkbox.checked = True %>
+ %endif
+ ${create_group_for_role_checkbox.get_html()} Create a new group of the same name for this role
</div><div class="form-row"><input type="submit" name="create_role_button" value="Save"/>
--- a/test/base/twilltestcase.py Thu Mar 10 15:20:40 2011 -0500
+++ b/test/base/twilltestcase.py Thu Mar 10 15:48:00 2011 -0500
@@ -1229,25 +1229,22 @@
description="This is Role One",
in_user_ids=[],
in_group_ids=[],
- create_group_for_role='no',
- private_role='' ):
+ create_group_for_role='',
+ private_role='',
+ strings_displayed=[] ):
"""Create a new role"""
- url = "%s/admin/roles?operation=create&create_role_button=Save&name=%s&description=%s" % ( self.url, name.replace( ' ', '+' ), description.replace( ' ', '+' ) )
+ url = "%s/admin/roles?operation=create&create_role_button=Save&name=%s&description=%s" % \
+ ( self.url, name.replace( ' ', '+' ), description.replace( ' ', '+' ) )
if in_user_ids:
url += "&in_users=%s" % ','.join( in_user_ids )
if in_group_ids:
url += "&in_groups=%s" % ','.join( in_group_ids )
if create_group_for_role == 'yes':
- url += '&create_group_for_role=yes'
+ url += '&create_group_for_role=yes&create_group_for_role=yes'
self.home()
self.visit_url( url )
- if create_group_for_role == 'yes':
- check_str = "Group '%s' has been created, and role '%s' has been created with %d associated users and %d associated groups" % \
- ( name, name, len( in_user_ids ), len( in_group_ids ) )
- else:
- check_str = "Role '%s' has been created with %d associated users and %d associated groups" % \
- ( name, len( in_user_ids ), len( in_group_ids ) )
- self.check_page_for_string( check_str )
+ for check_str in strings_displayed:
+ self.check_page_for_string( check_str )
if private_role:
# Make sure no private roles are displayed
try:
@@ -1304,17 +1301,19 @@
self.home()
# Tests associated with groups
- def create_group( self, name='Group One', in_user_ids=[], in_role_ids=[] ):
+ def create_group( self, name='Group One', in_user_ids=[], in_role_ids=[], create_role_for_group='', strings_displayed=[] ):
"""Create a new group"""
url = "%s/admin/groups?operation=create&create_group_button=Save&name=%s" % ( self.url, name.replace( ' ', '+' ) )
if in_user_ids:
url += "&in_users=%s" % ','.join( in_user_ids )
if in_role_ids:
url += "&in_roles=%s" % ','.join( in_role_ids )
+ if create_role_for_group == 'yes':
+ url += '&create_role_for_group=yes&create_role_for_group=yes'
self.home()
self.visit_url( url )
- check_str = "Group '%s' has been created with %d associated users and %d associated roles" % ( name, len( in_user_ids ), len( in_role_ids ) )
- self.check_page_for_string( check_str )
+ for check_str in strings_displayed:
+ self.check_page_for_string( check_str )
self.home()
self.visit_url( "%s/admin/groups" % self.url )
self.check_page_for_string( name )
--- a/test/functional/test_admin_features.py Thu Mar 10 15:20:40 2011 -0500
+++ b/test/functional/test_admin_features.py Thu Mar 10 15:48:00 2011 -0500
@@ -126,21 +126,26 @@
# Logged in as admin_user
name = 'Role One'
description = "This is Role Ones description"
- user_ids=[ str( admin_user.id ), str( regular_user1.id ), str( regular_user3.id ) ]
+ in_user_ids = [ str( admin_user.id ), str( regular_user1.id ), str( regular_user3.id ) ]
+ in_group_ids = []
+ # Add 1 to the number of associated groups since we are creating a new one with the same name as the role
+ num_gras = len( in_group_ids ) + 1
self.create_role( name=name,
description=description,
- in_user_ids=user_ids,
- in_group_ids=[],
+ in_user_ids=in_user_ids,
+ in_group_ids=in_group_ids,
create_group_for_role='yes',
- private_role=admin_user.email )
+ private_role=admin_user.email,
+ strings_displayed=[ "Role '%s' has been created with %d associated users and %d associated groups." % ( name, len( in_user_ids ), num_gras ),
+ "One of the groups associated with this role is the newly created group with the same name." ] )
# Get the role object for later tests
global role_one
role_one = sa_session.query( galaxy.model.Role ).filter( galaxy.model.Role.table.c.name==name ).first()
assert role_one is not None, 'Problem retrieving role named "Role One" from the database'
# Make sure UserRoleAssociations are correct
- if len( role_one.users ) != len( user_ids ):
+ if len( role_one.users ) != len( in_user_ids ):
raise AssertionError( '%d UserRoleAssociations were created for role id %d when it was created ( should have been %d )' \
- % ( len( role_one.users ), role_one.id, len( user_ids ) ) )
+ % ( len( role_one.users ), role_one.id, len( in_user_ids ) ) )
# Each of the following users should now have 2 role associations, their private role and role_one
for user in [ admin_user, regular_user1, regular_user3 ]:
refresh( user )
@@ -162,29 +167,36 @@
# Reset the role back to the original name and description
self.rename_role( self.security.encode_id( role_one.id ), name=name, description=description )
def test_035_create_group( self ):
- """Testing creating new group with 3 members and 1 associated role, then renaming it"""
+ """Testing creating new group with 3 members and 2 associated roles, then renaming it"""
# Logged in as admin_user
name = "Group One's Name"
- user_ids=[ str( admin_user.id ), str( regular_user1.id ), str( regular_user3.id ) ]
- role_ids=[ str( role_one.id ) ]
- self.create_group( name=name, in_user_ids=user_ids, in_role_ids=role_ids )
+ in_user_ids = [ str( admin_user.id ), str( regular_user1.id ), str( regular_user3.id ) ]
+ in_role_ids = [ str( role_one.id ) ]
+ # The number of GroupRoleAssociations should be 2, role_one and the newly created role named 'Group One's Name'
+ num_gras = len( in_role_ids ) + 1
+ self.create_group( name=name,
+ in_user_ids=in_user_ids,
+ in_role_ids=in_role_ids,
+ create_role_for_group='yes',
+ strings_displayed=[ "Group '%s' has been created with %d associated users and %d associated roles." % ( name, len( in_user_ids ), num_gras ),
+ "One of the roles associated with this group is the newly created role with the same name." ] )
# Get the group object for later tests
global group_one
group_one = get_group_by_name( name )
assert group_one is not None, 'Problem retrieving group named "Group One" from the database'
# Make sure UserGroupAssociations are correct
- if len( group_one.users ) != len( user_ids ):
+ if len( group_one.users ) != len( in_user_ids ):
raise AssertionError( '%d UserGroupAssociations were created for group id %d when it was created ( should have been %d )' \
- % ( len( group_one.users ), group_one.id, len( user_ids ) ) )
+ % ( len( group_one.users ), group_one.id, len( in_user_ids ) ) )
# Each user should now have 1 group association, group_one
for user in [ admin_user, regular_user1, regular_user3 ]:
refresh( user )
if len( user.groups ) != 1:
raise AssertionError( '%d UserGroupAssociations are associated with user %s ( should be 1 )' % ( len( user.groups ), user.email ) )
# Make sure GroupRoleAssociations are correct
- if len( group_one.roles ) != len( role_ids ):
+ if len( group_one.roles ) != num_gras:
raise AssertionError( '%d GroupRoleAssociations were created for group id %d when it was created ( should have been %d )' \
- % ( len( group_one.roles ), group_one.id, len( role_ids ) ) )
+ % ( len( group_one.roles ), group_one.id, num_gras ) )
# Rename the group
rename = "Group One's been Renamed"
self.rename_group( self.security.encode_id( group_one.id ), name=rename, )
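The twill tests above append `create_group_for_role=yes` twice (and likewise for `create_role_for_group`) because, as the comment removed from library_common.py in this changeset notes, checkboxes can cause two copies of a parameter to appear in the request. A rough illustration of how a `CheckboxField.is_checked`-style helper might cope with that; this is an assumption-based sketch, not Galaxy's actual implementation:

```python
# Sketch (assumed behavior, not Galaxy's real CheckboxField.is_checked):
# a checkbox rendered alongside a hidden input submits two values when
# checked and one when unchecked, so a list-valued parameter can be
# treated as "checked".
def is_checked(value):
    if value in ("yes", "true", True):
        return True
    # Two copies of the parameter arrive as a list when the box is checked.
    return isinstance(value, list)
```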
commit/galaxy-central: dan: Allow composite datatype datasets to be populated in the Upload tool from files that were uploaded to the Galaxy FTP server.
by Bitbucket 10 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/72d560d3e7fd/
changeset: r5207:72d560d3e7fd
user: dan
date: 2011-03-10 21:20:40
summary: Allow composite datatype datasets to be populated in the Upload tool from files that were uploaded to the Galaxy FTP server.
affected #: 1 file (1.6 KB)
--- a/lib/galaxy/tools/parameters/grouping.py Thu Mar 10 13:28:33 2011 -0500
+++ b/lib/galaxy/tools/parameters/grouping.py Thu Mar 10 15:20:40 2011 -0500
@@ -245,6 +245,7 @@
def get_one_filename( context ):
data_file = context['file_data']
url_paste = context['url_paste']
+ ftp_files = context['ftp_files']
name = context.get( 'NAME', None )
info = context.get( 'INFO', None )
warnings = []
@@ -252,13 +253,34 @@
if context.get( 'space_to_tab', None ) not in [ "None", None, False ]:
space_to_tab = True
file_bunch = get_data_file_filename( data_file, override_name = name, override_info = info )
- if file_bunch.path and url_paste:
- if url_paste.strip():
+ if file_bunch.path:
+ if url_paste is not None and url_paste.strip():
warnings.append( "All file contents specified in the paste box were ignored." )
- else: #we need to use url_paste
+ if ftp_files:
+ warnings.append( "All FTP uploaded file selections were ignored." )
+ elif url_paste is not None and url_paste.strip(): #we need to use url_paste
for file_bunch in get_url_paste_urls_or_filename( context, override_name = name, override_info = info ):
if file_bunch.path:
break
+ if file_bunch.path and ftp_files is not None:
+ warnings.append( "All FTP uploaded file selections were ignored." )
+ elif ftp_files is not None and trans.user is not None: # look for files uploaded via FTP
+ user_ftp_dir = os.path.join( trans.app.config.ftp_upload_dir, trans.user.email )
+ for ( dirpath, dirnames, filenames ) in os.walk( user_ftp_dir ):
+ for filename in filenames:
+ for ftp_filename in ftp_files:
+ if ftp_filename == filename:
+ path = relpath( os.path.join( dirpath, filename ), user_ftp_dir )
+ if not os.path.islink( os.path.join( dirpath, filename ) ):
+ ftp_data_file = { 'local_filename' : os.path.abspath( os.path.join( user_ftp_dir, path ) ),
+ 'filename' : os.path.basename( path ) }
+ file_bunch = get_data_file_filename( ftp_data_file, override_name = name, override_info = info )
+ if file_bunch.path:
+ break
+ if file_bunch.path:
+ break
+ if file_bunch.path:
+ break
file_bunch.space_to_tab = space_to_tab
return file_bunch, warnings
def get_filenames( context ):
commit/galaxy-central: kellyv: Modified script that adds manual builds to add build even if chrom length details not present; added a few new manual builds
by Bitbucket 10 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/3d53f4f548c7/
changeset: r5206:3d53f4f548c7
user: kellyv
date: 2011-03-10 19:28:33
summary: Modified script that adds manual builds to add build even if chrom length details not present; added a few new manual builds
affected #: 2 files (656 bytes)
commit/galaxy-central: rc: Extending sample search to include search using form fields. Resolved #489
by Bitbucket 09 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/b001ba3b7b3a/
changeset: r5205:b001ba3b7b3a
user: rc
date: 2011-03-09 21:33:04
summary: Extending sample search to include search using form fields. Resolved #489
affected #: 2 files (1.4 KB)
--- a/lib/galaxy/web/controllers/requests_common.py Wed Mar 09 14:50:54 2011 -0500
+++ b/lib/galaxy/web/controllers/requests_common.py Wed Mar 09 15:33:04 2011 -0500
@@ -748,6 +748,20 @@
trans.model.SampleDataset.table.c.sample_id==trans.model.Sample.table.c.id,
func.lower( trans.model.SampleDataset.table.c.name ).like( "%" + search_string.lower() + "%" ) ) ) \
.order_by( trans.model.Sample.table.c.create_time.desc() )
+ elif search_type == 'form value':
+ samples = []
+ if search_string.find('=') != -1:
+ field_label, field_value = search_string.split('=')
+ all_samples = trans.sa_session.query( trans.model.Sample ) \
+ .filter( trans.model.Sample.table.c.deleted==False ) \
+ .order_by( trans.model.Sample.table.c.create_time.desc() )
+ for sample in all_samples:
+ # find the field in the sample form with the given label
+ for field in sample.request.type.sample_form.fields:
+ if field_label == field['label']:
+ # check if the value is equal to the value in the search string
+ if sample.values.content[ field['name'] ] == field_value:
+ samples.append( sample )
if is_admin:
for s in samples:
if not s.request.deleted and s.request.state in request_states:
@@ -770,7 +784,7 @@
display='checkboxes' )
# Build the search_type SelectField
selected_value = kwd.get( 'search_type', 'sample name' )
- types = [ 'sample name', 'bar_code', 'dataset' ]
+ types = [ 'sample name', 'bar_code', 'dataset', 'form value' ]
search_type = build_select_field( trans, types, 'self', 'search_type', selected_value=selected_value, refresh_on_change=False )
# Build the search_box TextField
search_box = TextField( 'search_box', 50, kwd.get('search_box', '' ) )
--- a/templates/requests/common/find_samples.mako Wed Mar 09 14:50:54 2011 -0500
+++ b/templates/requests/common/find_samples.mako Wed Mar 09 15:33:04 2011 -0500
@@ -47,8 +47,15 @@
${search_box.get_html()}
<input type="submit" name="find_samples_button" value="Find"/><div class="toolParamHelp" style="clear: both;">
+ <p>
Wildcard search (%) can be used as placeholder for any sequence of characters or words.<br/>
For example, to search for samples starting with 'mysample' use 'mysample%' as the search string.
+ </p>
+ <p>
+ When 'form value' search type is selected, then enter the search string in 'field label=value' format.
+ <br/>For example, when searching for all samples whose 'Volume' field is 1.3mL, then the search string
+ should be 'Volume=1.3mL' (without qoutes).
+ </p></div></div>
%if results:
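The new 'form value' branch above splits the search string on '=' into a field label and an expected value, then compares each sample's form contents. A small standalone sketch of that matching with invented record shapes; it splits only on the first '=', since the patch's bare `split('=')` would raise a ValueError if the value itself contained an equals sign:

```python
# Sketch of 'field label=value' matching; the dict-based sample shape is
# invented for illustration and does not mirror Galaxy's model objects.
def match_form_value(search_string, samples):
    if "=" not in search_string:
        return []
    field_label, field_value = search_string.split("=", 1)
    hits = []
    for sample in samples:
        # form_fields: [{'label': ..., 'name': ...}]; values: {name: value}
        for field in sample["form_fields"]:
            if (field["label"] == field_label
                    and sample["values"].get(field["name"]) == field_value):
                hits.append(sample)
    return hits
```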
commit/galaxy-central: greg: Add functional tests to cover some recent new library features, and a couple of miscellaneous UI improvements.
by Bitbucket 09 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/ba87157587e0/
changeset: r5204:ba87157587e0
user: greg
date: 2011-03-09 20:50:54
summary: Add functional tests to cover some recent new library features, and a couple of miscellaneous UI improvements.
affected #: 5 files (8.5 KB)
--- a/lib/galaxy/web/controllers/library_common.py Wed Mar 09 14:22:10 2011 -0500
+++ b/lib/galaxy/web/controllers/library_common.py Wed Mar 09 14:50:54 2011 -0500
@@ -2132,7 +2132,8 @@
def __build_target_folder_id_select_field( trans, folders, selected_value='none' ):
for folder in folders:
if not folder.parent:
- folder.name = 'Data library root folder'
+ folder.name = 'Data library root'
+ break
return build_select_field( trans,
objs=folders,
label_attr='name',
--- a/lib/galaxy/web/controllers/requests_common.py Wed Mar 09 14:22:10 2011 -0500
+++ b/lib/galaxy/web/controllers/requests_common.py Wed Mar 09 14:50:54 2011 -0500
@@ -1636,9 +1636,11 @@
else:
selected_folder_id = 'none'
folders = []
- # TODO: Change the name of the library root folder to "Library root" to clarify to the
- # user that it is the root folder. We probably should just change this in the Library code,
- # and update the data in the db.
+ # Change the name of the library root folder to clarify that it is the root
+ for folder in folders:
+ if not folder.parent:
+ folder.name = 'Data library root'
+ break
folder_select_field = build_select_field( trans,
folders,
'name',
--- a/templates/library/common/move_library_item.mako Wed Mar 09 14:22:10 2011 -0500
+++ b/templates/library/common/move_library_item.mako Wed Mar 09 14:50:54 2011 -0500
@@ -1,6 +1,16 @@
<%namespace file="/message.mako" import="render_msg" /><%inherit file="/base.mako"/>
+<%def name="javascripts()">
+ ${parent.javascripts()}
+ ${h.js("jquery.autocomplete", "autocomplete_tagging" )}
+</%def>
+
+<%def name="stylesheets()">
+ ${parent.stylesheets()}
+ ${h.css( "autocomplete_tagging" )}
+</%def>
+
<%
if source_library:
source_library_id = trans.security.encode_id( source_library.id )
--- a/test/base/twilltestcase.py Wed Mar 09 14:22:10 2011 -0500
+++ b/test/base/twilltestcase.py Wed Mar 09 14:50:54 2011 -0500
@@ -2144,7 +2144,8 @@
#tc.fv( "1", "do_action", format )
#tc.submit( "action_on_datasets_button" )
# Here's the new approach...
- url = "%s/library_common/act_on_multiple_datasets?cntrller=%s&library_id=%s&do_action=%s" % ( self.url, cntrller, library_id, format )
+ url = "%s/library_common/act_on_multiple_datasets?cntrller=%s&library_id=%s&do_action=%s" \
+ % ( self.url, cntrller, library_id, format )
for ldda_id in ldda_ids:
url += "&ldda_ids=%s" % ldda_id
self.visit_url( url )
@@ -2195,6 +2196,21 @@
errmsg += 'Unpacked archive remains in: %s\n' % tmpd
raise AssertionError( errmsg )
shutil.rmtree( tmpd )
+ def move_library_item( self, cntrller, item_type, item_id, source_library_id, make_target_current,
+ target_library_id='', target_folder_id='', strings_displayed=[], strings_displayed_after_submit=[] ):
+ self.home()
+ self.visit_url( "%s/library_common/move_library_item?cntrller=%s&item_type=%s&item_id=%s&source_library_id=%s&make_target_current=%s" \
+ % ( self.url, cntrller, item_type, item_id, source_library_id, make_target_current ) )
+ if target_library_id:
+ self.refresh_form( 'target_library_id', target_library_id )
+ if target_folder_id:
+ tc.fv( '1', 'target_folder_id', target_folder_id )
+ for check_str in strings_displayed:
+ self.check_page_for_string( check_str )
+ tc.submit( 'move_library_item_button' )
+ for check_str in strings_displayed_after_submit:
+ self.check_page_for_string( check_str )
+ self.home()
def delete_library_item( self, cntrller, library_id, item_id, item_name, item_type='library_dataset' ):
"""Mark a library item as deleted"""
self.home()
--- a/test/functional/test_library_features.py Wed Mar 09 14:22:10 2011 -0500
+++ b/test/functional/test_library_features.py Wed Mar 09 14:50:54 2011 -0500
@@ -35,7 +35,7 @@
def test_005_create_libraries( self ):
"""Testing creating libraries used in this script, then renaming one of them"""
# Logged in as admin_user
- for index in range( 0, 1 ):
+ for index in range( 0, 3 ):
name = 'library%s' % str( index + 1 )
description = '%s description' % name
synopsis = '%s synopsis' % name
@@ -45,6 +45,12 @@
global library1
library1 = get_library( 'library1', 'library1 description', 'library1 synopsis' )
assert library1 is not None, 'Problem retrieving library (library1) from the database'
+ global library2
+ library2 = get_library( 'library2', 'library2 description', 'library2 synopsis' )
+ assert library2 is not None, 'Problem retrieving library (library2) from the database'
+ global library3
+ library3 = get_library( 'library3', 'library3 description', 'library3 synopsis' )
+ assert library3 is not None, 'Problem retrieving library (library3) from the database'
# Rename the library
new_name = "library1 new name"
new_description = "library1 new description"
@@ -413,13 +419,128 @@
raise AssertionError( 'The library_dataset id %s named "%s" has not been marked as deleted.' % \
( str( library_dataset.id ), library_dataset.name ) )
check_folder( library1.root_folder )
+ def test_120_populate_public_library2( self ):
+ """Testing populating library2 with folders and datasets"""
+ # Logged in as admin_user
+ # Add a folder named One to library2 root
+ root_folder = library2.root_folder
+ name = "One"
+ description = "One description"
+ self.add_folder( 'library_admin',
+ self.security.encode_id( library2.id ),
+ self.security.encode_id( root_folder.id ),
+ name=name,
+ description=description )
+ global folder3
+ folder3 = get_folder( root_folder.id, name, description )
+ assert folder3 is not None, 'Problem retrieving library folder named "%s" from the database' % name
+ # Upload dataset 1.bed to folder One
+ filename = '1.bed'
+ ldda_message = "Testing uploading %s" % filename
+ self.upload_library_dataset( cntrller='library_admin',
+ library_id=self.security.encode_id( library2.id ),
+ folder_id=self.security.encode_id( folder3.id ),
+ filename=filename,
+ file_type='bed',
+ dbkey='hg18',
+ ldda_message=ldda_message,
+ strings_displayed=[ 'Upload files' ] )
+ global ldda5
+ ldda5 = get_latest_ldda_by_name( filename )
+ assert ldda5 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda5 from the database'
+ # Add a sub-folder named Two to folder One
+ name = "Two"
+ description = "Two description"
+ self.add_folder( 'library_admin',
+ self.security.encode_id( library2.id ),
+ self.security.encode_id( folder3.id ),
+ name=name,
+ description=description )
+ global folder4
+ folder4 = get_folder( folder3.id, name, description )
+ assert folder4 is not None, 'Problem retrieving library folder named "%s" from the database' % name
+ # Upload dataset 2.bed to folder Two
+ filename = '2.bed'
+ ldda_message = "Testing uploading %s" % filename
+ self.upload_library_dataset( cntrller='library_admin',
+ library_id=self.security.encode_id( library2.id ),
+ folder_id=self.security.encode_id( folder4.id ),
+ filename=filename,
+ file_type='bed',
+ dbkey='hg18',
+ ldda_message=ldda_message,
+ strings_displayed=[ 'Upload files' ] )
+ global ldda6
+ ldda6 = get_latest_ldda_by_name( filename )
+ assert ldda6 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda6 from the database'
+ # Add a folder named Three to library2 root
+ name = "Three"
+ description = "Three description"
+ self.add_folder( 'library_admin',
+ self.security.encode_id( library2.id ),
+ self.security.encode_id( root_folder.id ),
+ name=name,
+ description=description )
+ global folder5
+ folder5 = get_folder( root_folder.id, name, description )
+ assert folder5 is not None, 'Problem retrieving library folder named "%s" from the database' % name
+ # Upload dataset 3.bed to library2 root folder
+ filename = '3.bed'
+ ldda_message = "Testing uploading %s" % filename
+ self.upload_library_dataset( cntrller='library_admin',
+ library_id=self.security.encode_id( library2.id ),
+ folder_id=self.security.encode_id( root_folder.id ),
+ filename=filename,
+ file_type='bed',
+ dbkey='hg18',
+ ldda_message=ldda_message,
+ strings_displayed=[ 'Upload files' ] )
+ global ldda7
+ ldda7 = get_latest_ldda_by_name( filename )
+ assert ldda7 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda7 from the database'
+ def test_125_move_dataset_within_library2( self ):
+ """Testing moving a dataset within library2"""
+ # Logged in as admin_user
+ # Move 3.bed to folder Three
+ self.move_library_item( cntrller='library_admin',
+ item_type='ldda',
+ item_id=self.security.encode_id( ldda7.id ),
+ source_library_id=self.security.encode_id( library2.id ),
+ make_target_current=True,
+ target_folder_id=self.security.encode_id( folder5.id ),
+ strings_displayed=[ 'Move data library items',
+ '3.bed' ],
+ strings_displayed_after_submit=[ '1 dataset moved to folder (Three) within data library (library2)' ] )
+ def test_130_move_folder_to_another_library( self ):
+ """Testing moving a folder to another library"""
+ # Logged in as admin_user
+ # Move folder Three which now includes 3.bed to library3
+ self.move_library_item( cntrller='library_admin',
+ item_type='folder',
+ item_id=self.security.encode_id( folder5.id ),
+ source_library_id=self.security.encode_id( library2.id ),
+ make_target_current=False,
+ target_library_id=self.security.encode_id( library3.id ),
+ target_folder_id=self.security.encode_id( library3.root_folder.id ),
+ strings_displayed=[ 'Move data library items',
+ 'Three' ],
+ strings_displayed_after_submit=[ 'Moved folder (Three) to folder (library3) within data library (library3)' ] )
+ # Make sure folder Three is no longer in library2
+ self.browse_library( cntrller='library_admin',
+ library_id=self.security.encode_id( library2.id ),
+ strings_displayed=[ folder4.name, folder4.description ],
+ strings_not_displayed=[ folder5.name, folder5.description ] )
+ # Make sure folder Three was moved to library3
+ self.browse_library( cntrller='library_admin',
+ library_id=self.security.encode_id( library3.id ),
+ strings_displayed=[ folder5.name, folder5.description, ldda7.name ] )
def test_999_reset_data_for_later_test_runs( self ):
"""Resetting data to enable later test runs to pass"""
# Logged in as admin_user
##################
# Purge all libraries
##################
- for library in [ library1 ]:
+ for library in [ library1, library2, library3 ]:
self.delete_library_item( 'library_admin',
self.security.encode_id( library.id ),
self.security.encode_id( library.id ),
commit/galaxy-central: jgoecks: Fix bugs in GFFReaderWrapper so that GFF3 files are read properly. Add GFF3 test to gff_filter_by_feature_count.
by Bitbucket 09 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/917fe1e0356e/
changeset: r5203:917fe1e0356e
user: jgoecks
date: 2011-03-09 20:22:10
summary: Fix bugs in GFFReaderWrapper so that GFF3 files are read properly. Add GFF3 test to gff_filter_by_feature_count.
affected #: 4 files (903 bytes)
--- a/lib/galaxy/datatypes/util/gff_util.py Tue Mar 08 19:01:59 2011 -0500
+++ b/lib/galaxy/datatypes/util/gff_util.py Wed Mar 09 14:22:10 2011 -0500
@@ -12,8 +12,18 @@
"""
def __init__( self, reader, fields, chrom_col, feature_col, start_col, end_col, \
strand_col, score_col, default_strand, fix_strand=False, raw_line='' ):
+ # HACK: GFF format allows '.' for strand but GenomicInterval does not. To get around this,
+ # temporarily set strand and then unset after initing GenomicInterval.
+ unknown_strand = False
+ if not fix_strand and fields[ strand_col ] == '.':
+ unknown_strand = True
+ fields[ strand_col ] = '+'
GenomicInterval.__init__( self, reader, fields, chrom_col, start_col, end_col, strand_col, \
default_strand, fix_strand=fix_strand )
+ if unknown_strand:
+ self.strand = '.'
+ self.fields[ strand_col ] = '.'
+
# Handle feature, score column.
self.feature_col = feature_col
if self.feature_col >= self.nfields:
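The strand workaround in the first hunk above can be sketched on its own. `resolve_gff_strand` below is a hypothetical stand-in that assumes only what the diff's comment states: the base `GenomicInterval` class rejects GFF's '.' (unknown) strand:

```python
def resolve_gff_strand(fields, strand_col, fix_strand=False):
    """Handle GFF's '.' strand around a base class that only accepts '+'/'-'.

    Temporarily substitute '+' so base-class initialization succeeds,
    then restore '.' on both the strand attribute and the raw fields.
    """
    unknown_strand = not fix_strand and fields[strand_col] == '.'
    if unknown_strand:
        fields[strand_col] = '+'
    # ... base-class initialization would run here with a valid strand ...
    strand = fields[strand_col]
    if unknown_strand:
        strand = '.'
        fields[strand_col] = '.'
    return strand
```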
@@ -40,13 +50,10 @@
self.intervals = intervals
# Use intervals to set feature attributes.
for interval in self.intervals:
- # Error checking.
+ # Error checking. NOTE: intervals need not share the same strand.
if interval.chrom != self.chrom:
- raise ValueError( "interval chrom does not match self chrom: %i != %i" % \
+ raise ValueError( "interval chrom does not match self chrom: %s != %s" % \
( interval.chrom, self.chrom ) )
- if interval.strand != self.strand:
- raise ValueError( "interval strand does not match self strand: %s != %s" % \
- ( interval.strand, self.strand ) )
# Set start, end of interval.
if interval.start < self.start:
self.start = interval.start
@@ -140,7 +147,7 @@
# For debugging, uncomment this to propagate parsing exceptions up.
# I.e. the underlying reason for an unexpected StopIteration exception
# can be found by uncommenting this.
- # raise e
+ #raise e
#
# Get next GFFFeature
@@ -163,7 +170,7 @@
# Initialize feature name from seed.
feature_group = self.seed_interval.attributes.get( 'group', None ) # For GFF
- feature_id = self.seed_interval.attributes.get( 'id', None ) # For GFF3
+ feature_id = self.seed_interval.attributes.get( 'ID', None ) # For GFF3
feature_gene_id = self.seed_interval.attributes.get( 'gene_id', None ) # For GTF
feature_transcript_id = self.seed_interval.attributes.get( 'transcript_id', None ) # For GTF
@@ -183,11 +190,14 @@
# If interval not associated with feature, break.
group = interval.attributes.get( 'group', None )
+ # GFF test:
if group and feature_group != group:
break
- id = interval.attributes.get( 'id', None )
- if id and feature_id != id:
+ # GFF3 test:
+ parent = interval.attributes.get( 'Parent', None )
+ if feature_id and feature_id != parent:
break
+ # GTF test:
gene_id = interval.attributes.get( 'gene_id', None )
transcript_id = interval.attributes.get( 'transcript_id', None )
if ( transcript_id and transcript_id != feature_transcript_id ) or \
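The three membership tests in the hunk above (GFF's 'group', GFF3's 'ID'/'Parent', GTF's 'gene_id'/'transcript_id') decide whether the next interval still belongs to the current feature. A condensed sketch, assuming attributes are plain dicts; the function name is illustrative, not from the commit:

```python
def belongs_to_feature(interval_attrs, feature_group, feature_id,
                       feature_gene_id, feature_transcript_id):
    """Return True if an interval's attributes match the current feature."""
    # GFF test: the interval must share the feature's 'group' attribute.
    group = interval_attrs.get('group')
    if group and feature_group != group:
        return False
    # GFF3 test: the interval's 'Parent' must point at the feature's 'ID'.
    parent = interval_attrs.get('Parent')
    if feature_id and feature_id != parent:
        return False
    # GTF test: same transcript (and same gene).
    gene_id = interval_attrs.get('gene_id')
    transcript_id = interval_attrs.get('transcript_id')
    if (transcript_id and transcript_id != feature_transcript_id) or \
       (gene_id and gene_id != feature_gene_id):
        return False
    return True
```

Note the GFF3 fix in the diff: the seed interval's attribute is looked up as 'ID' (case-sensitive per GFF3), and children reference it through 'Parent' rather than repeating 'id'.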
--- a/tools/filters/gff/gff_filter_by_feature_count.py Tue Mar 08 19:01:59 2011 -0500
+++ b/tools/filters/gff/gff_filter_by_feature_count.py Wed Mar 09 14:22:10 2011 -0500
@@ -8,8 +8,7 @@
import sys
from galaxy import eggs
from galaxy.datatypes.util.gff_util import GFFReaderWrapper
-
-assert sys.version_info[:2] >= ( 2, 4 )
+from bx.intervals.io import GenomicInterval
# Valid operators, ordered so that complex operators (e.g. '>=') are
# recognized before simple operators (e.g. '>')
@@ -62,7 +61,9 @@
skipped_lines = 0
first_skipped_line = 0
out = open( output_name, 'w' )
- for i, feature in enumerate( GFFReaderWrapper( open( input_name ), fix_strand=True ) ):
+ for i, feature in enumerate( GFFReaderWrapper( open( input_name ) ) ):
+ if not isinstance( feature, GenomicInterval ):
+ continue
count = 0
for interval in feature.intervals:
if interval.feature == feature_name:
@@ -73,6 +74,9 @@
out.write( "\t".join(interval.fields) + '\n' )
kept_features += 1
+ # Needed because i is 0-based but we want to display stats as 1-based.
+ i += 1
+
# Clean up.
out.close()
info_msg = "%i of %i features kept (%.2f%%) using condition %s. " % \
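The rewritten tool keeps the original eval-based condition check (`eval('%s %s' % (count, condition))`). An equivalent sketch without `eval`, using the `operator` module; the ordering mirrors the tool's note that complex operators (e.g. '>=') must be matched before simple ones (e.g. '>'):

```python
import operator

# Multi-character operators first, so '>=' is not misread as '>'.
_OPS = [('>=', operator.ge), ('<=', operator.le), ('!=', operator.ne),
        ('==', operator.eq), ('>', operator.gt), ('<', operator.lt),
        ('=', operator.eq)]

def condition_passes(count, condition):
    """Evaluate a condition string such as '>1' or '>=5' against a count."""
    for text, op in _OPS:
        if condition.startswith(text):
            return op(count, int(condition[len(text):]))
    raise ValueError("unrecognized condition: %s" % condition)
```

This avoids passing user-supplied text to `eval`, at the cost of restricting conditions to simple integer comparisons, which is all the tool's diff exercises.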
--- a/tools/filters/gff/gff_filter_by_feature_count.xml Tue Mar 08 19:01:59 2011 -0500
+++ b/tools/filters/gff/gff_filter_by_feature_count.xml Wed Mar 09 14:22:10 2011 -0500
@@ -20,12 +20,20 @@
<data format="input" name="out_file1" metadata_source="input_file1"/></outputs><tests>
- <test>
- <param name="input_file1" value="gops_subtract_in1.gff"/>
- <param name="feature_name" value="exon"/>
- <param name="cond" value=">1"/>
- <output name="out_file1" file="gff_filter_by_feature_count_out1.gff"/>
- </test>
+ <!-- Test GTF filtering. -->
+ <test>
+ <param name="input_file1" value="gops_subtract_in1.gff"/>
+ <param name="feature_name" value="exon"/>
+ <param name="cond" value=">1"/>
+ <output name="out_file1" file="gff_filter_by_feature_count_out1.gff"/>
+ </test>
+ <!-- Test GFF3 filtering. -->
+ <test>
+ <param name="input_file1" value="5.gff3"/>
+ <param name="feature_name" value="HSP"/>
+ <param name="cond" value=">=5"/>
+ <output name="out_file1" file="gff_filter_by_feature_count_out2.gff"/>
+ </test></tests><help>
commit/galaxy-central: jgoecks: Use GFFReaderWrapper in gff_filter_by_feature_count tool in order to leverage support for GTF, GFF, and GFF3 found in reader wrapper. This also simplifies the tool considerably.
by Bitbucket 08 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/c9e6bc81817a/
changeset: r5202:c9e6bc81817a
user: jgoecks
date: 2011-03-09 01:01:59
summary: Use GFFReaderWrapper in gff_filter_by_feature_count tool in order to leverage support for GTF, GFF, and GFF3 found in reader wrapper. This also simplifies the tool considerably.
affected #: 2 files (2.7 KB)
--- a/lib/galaxy/datatypes/util/gff_util.py Tue Mar 08 16:17:09 2011 -0500
+++ b/lib/galaxy/datatypes/util/gff_util.py Tue Mar 08 19:01:59 2011 -0500
@@ -10,14 +10,18 @@
A GFF interval, including attributes. If file is strictly a GFF file,
only attribute is 'group.'
"""
- def __init__( self, reader, fields, chrom_col, start_col, end_col, strand_col, \
- score_col, default_strand, fix_strand=False, raw_line='' ):
+ def __init__( self, reader, fields, chrom_col, feature_col, start_col, end_col, \
+ strand_col, score_col, default_strand, fix_strand=False, raw_line='' ):
GenomicInterval.__init__( self, reader, fields, chrom_col, start_col, end_col, strand_col, \
default_strand, fix_strand=fix_strand )
- # Handle score column.
+ # Handle feature, score column.
+ self.feature_col = feature_col
+ if self.feature_col >= self.nfields:
+ raise MissingFieldError( "No field for feature_col (%d)" % feature_col )
+ self.feature = self.fields[ self.feature_col ]
self.score_col = score_col
if self.score_col >= self.nfields:
- raise MissingFieldError( "No field for score_col (%d)" % score_col )
+ raise MissingFieldError( "No field for score_col (%d)" % score_col )
self.score = self.fields[ self.score_col ]
# Attributes specific to GFF.
@@ -28,10 +32,11 @@
"""
A GFF feature, which can include multiple intervals.
"""
- def __init__( self, reader, chrom_col, start_col, end_col, strand_col, score_col, default_strand, \
- fix_strand=False, intervals=[] ):
- GFFInterval.__init__( self, reader, intervals[0].fields, chrom_col, start_col, end_col, \
- strand_col, score_col, default_strand, fix_strand=fix_strand )
+ def __init__( self, reader, chrom_col, feature_col, start_col, end_col, \
+ strand_col, score_col, default_strand, fix_strand=False, intervals=[] ):
+ GFFInterval.__init__( self, reader, intervals[0].fields, chrom_col, feature_col, \
+ start_col, end_col, strand_col, score_col, default_strand, \
+ fix_strand=fix_strand )
self.intervals = intervals
# Use intervals to set feature attributes.
for interval in self.intervals:
@@ -99,20 +104,20 @@
expect traditional interval format.
"""
- def __init__( self, reader, chrom_col=0, start_col=3, end_col=4, strand_col=6, score_col=5, \
- fix_strand=False, **kwargs ):
+ def __init__( self, reader, chrom_col=0, feature_col=2, start_col=3, \
+ end_col=4, strand_col=6, score_col=5, fix_strand=False, **kwargs ):
NiceReaderWrapper.__init__( self, reader, chrom_col=chrom_col, start_col=start_col, end_col=end_col, \
strand_col=strand_col, fix_strand=fix_strand, **kwargs )
- # HACK: NiceReaderWrapper (bx-python) does not handle score_col yet, so store ourselves.
+ self.feature_col = feature_col
self.score_col = score_col
self.last_line = None
self.cur_offset = 0
self.seed_interval = None
def parse_row( self, line ):
- interval = GFFInterval( self, line.split( "\t" ), self.chrom_col, self.start_col, \
- self.end_col, self.strand_col, self.score_col, self.default_strand, \
- fix_strand=self.fix_strand, raw_line=line )
+ interval = GFFInterval( self, line.split( "\t" ), self.chrom_col, self.feature_col, \
+ self.start_col, self.end_col, self.strand_col, self.score_col, \
+ self.default_strand, fix_strand=self.fix_strand, raw_line=line )
return interval
def next( self ):
@@ -196,8 +201,9 @@
self.seed_interval = interval
# Return GFF feature with all intervals.
- return GFFFeature( self, self.chrom_col, self.start_col, self.end_col, self.strand_col, \
- self.score_col, self.default_strand, fix_strand=self.fix_strand, \
+ return GFFFeature( self, self.chrom_col, self.feature_col, self.start_col, \
+ self.end_col, self.strand_col, self.score_col, \
+ self.default_strand, fix_strand=self.fix_strand, \
intervals=feature_intervals )
--- a/tools/filters/gff/gff_filter_by_feature_count.py Tue Mar 08 16:17:09 2011 -0500
+++ b/tools/filters/gff/gff_filter_by_feature_count.py Tue Mar 08 19:01:59 2011 -0500
@@ -7,7 +7,7 @@
"""
import sys
from galaxy import eggs
-from galaxy.datatypes.util.gff_util import parse_gff_attributes
+from galaxy.datatypes.util.gff_util import GFFReaderWrapper
assert sys.version_info[:2] >= ( 2, 4 )
@@ -58,77 +58,25 @@
break
# Do filtering.
- kept_lines = 0
+ kept_features = 0
skipped_lines = 0
first_skipped_line = 0
out = open( output_name, 'w' )
- i = 0
- cur_transcript_id = None
- cur_transcript_lines = []
- cur_transcript_feature_counts = {} # Key is feature name, value is feature count.
- for i, line in enumerate( file( input_name ) ):
- line = line.rstrip( '\r\n' )
- if line and not line.startswith( '#' ):
- try:
- # GFF format: chrom, source, feature, chromStart, chromEnd, score, strand, attributes
- elems = line.split( '\t' )
- feature = elems[2]
- start = str( long( elems[3] ) - 1 )
- coords = [ long( start ), long( elems[4] ) ]
- strand = elems[6]
- attributes = parse_gff_attributes( elems[8] )
- t_id = attributes.get( "transcript_id", None )
-
- if not t_id:
- # No transcript id, so pass line to output.
- out.write( line )
- kept_lines += 1
- continue
-
- # There is a transcript ID, so process line at transcript level.
- if t_id == cur_transcript_id:
- # Line is element of transcript; increment feature count.
- if not feature in cur_transcript_feature_counts:
- cur_transcript_feature_counts[feature] = 0
- cur_transcript_feature_counts[feature] += 1
- cur_transcript_lines.append( line )
- continue
-
- #
- # Line is part of new transcript; filter previous transcript.
- #
-
- # Filter/write previous transcript.
- result = eval( '%s %s' % ( cur_transcript_feature_counts.get( feature_name, 0 ), condition ) )
- if cur_transcript_id and result:
- # Transcript passes filter; write transcript line to file."
- out.write( "\n".join( cur_transcript_lines ) + "\n" )
- kept_lines += len( cur_transcript_lines )
-
- # Start new transcript.
- cur_transcript_id = t_id
- cur_transcript_feature_counts = {}
- cur_transcript_feature_counts[feature] = 1
- cur_transcript_lines = [ line ]
- except Exception, e:
- print e
- skipped_lines += 1
- if not first_skipped_line:
- first_skipped_line = i + 1
- else:
- skipped_lines += 1
- if not first_skipped_line:
- first_skipped_line = i + 1
-
- # Write last transcript.
- if cur_transcript_id and eval( '%s %s' % ( cur_transcript_feature_counts[feature_name], condition ) ):
- # Transcript passes filter; write transcript lints to file.
- out.write( "\n".join( cur_transcript_lines ) + "\n" )
- kept_lines += len( cur_transcript_lines )
+ for i, feature in enumerate( GFFReaderWrapper( open( input_name ), fix_strand=True ) ):
+ count = 0
+ for interval in feature.intervals:
+ if interval.feature == feature_name:
+ count += 1
+ if eval( '%s %s' % ( count, condition ) ):
+ # Keep feature.
+ for interval in feature.intervals:
+ out.write( "\t".join(interval.fields) + '\n' )
+ kept_features += 1
# Clean up.
out.close()
- info_msg = "%i lines kept (%.2f%%) using condition %s. " % ( kept_lines, float(kept_lines)/i * 100.0, feature_name + condition )
+ info_msg = "%i of %i features kept (%.2f%%) using condition %s. " % \
+ ( kept_features, i, float(kept_features)/i * 100.0, feature_name + condition )
if skipped_lines > 0:
info_msg += "Skipped %d blank/comment/invalid lines starting with line #%d." %( skipped_lines, first_skipped_line )
print info_msg
commit/galaxy-central: kellyv: Update SRMA test files so that functional tests pass
by Bitbucket 08 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/575531ef4a53/
changeset: r5201:575531ef4a53
user: kellyv
date: 2011-03-08 22:17:09
summary: Update SRMA test files so that functional tests pass
affected #: 5 files (10.7 MB)
Diff too large to display.
commit/galaxy-central: greg: Add the ability to move data library items within a data library or between data libraries.
by Bitbucket 08 Mar '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/ed7b6180b925/
changeset: r5200:ed7b6180b925
user: greg
date: 2011-03-08 21:50:19
summary: Add the ability to move data library items within a data library or between data libraries.
affected #: 7 files (16.8 KB)
--- a/lib/galaxy/model/__init__.py Tue Mar 08 14:35:26 2011 -0500
+++ b/lib/galaxy/model/__init__.py Tue Mar 08 15:50:19 2011 -0500
@@ -941,6 +941,28 @@
self.description = description
self.synopsis = synopsis
self.root_folder = root_folder
+ def get_active_folders( self, folder, folders=None ):
+ # TODO: should we make sure the library is not deleted?
+ def sort_by_attr( seq, attr ):
+ """
+ Sort the sequence of objects by object's attribute
+ Arguments:
+ seq - the list or any sequence (including immutable one) of objects to sort.
+ attr - the name of attribute to sort by
+ """
+ # Use the "Schwartzian transform"
+ # Create the auxiliary list of tuples where every i-th tuple has form
+ # (seq[i].attr, i, seq[i]) and sort it. The second item of tuple is needed not
+ # only to provide stable sorting, but mainly to eliminate comparison of objects
+ # (which can be expensive or prohibited) in case of equal attribute values.
+ intermed = map( None, map( getattr, seq, ( attr, ) * len( seq ) ), xrange( len( seq ) ), seq )
+ intermed.sort()
+ return map( operator.getitem, intermed, ( -1, ) * len( intermed ) )
+ if folders is None:
+ active_folders = [ folder ]
+ for active_folder in folder.active_folders:
+ active_folders.extend( self.get_active_folders( active_folder, folders ) )
+ return sort_by_attr( active_folders, 'id' )
def get_info_association( self, restrict=False, inherited=False ):
if self.info_association:
if not inherited or self.info_association[0].inheritable:
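The `sort_by_attr` helper in the hunk above uses Python 2's `map(None, ...)` zip idiom. The same decorate-sort-undecorate ("Schwartzian transform") idea in portable form; a sketch, not the commit's code:

```python
def sort_by_attr(seq, attr):
    """Stable sort of objects by an attribute without comparing the objects.

    Decorate each item as (attr value, original index, item); the index
    breaks ties so equal attribute values never fall through to comparing
    the objects themselves, which may be expensive or unsupported.
    """
    decorated = [(getattr(item, attr), i, item) for i, item in enumerate(seq)]
    decorated.sort()
    return [item for _, _, item in decorated]
```

`get_active_folders` then uses this to return the folder plus all of its active sub-folders, recursively, ordered by id.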
--- a/lib/galaxy/web/controllers/library_common.py Tue Mar 08 14:35:26 2011 -0500
+++ b/lib/galaxy/web/controllers/library_common.py Tue Mar 08 15:50:19 2011 -0500
@@ -1539,16 +1539,6 @@
show_deleted = util.string_as_bool( params.get( 'show_deleted', False ) )
use_panels = util.string_as_bool( params.get( 'use_panels', False ) )
action = params.get( 'do_action', None )
- if action == 'import_to_histories':
- return trans.response.send_redirect( web.url_for( controller='library_common',
- action='import_datasets_to_histories',
- cntrller=cntrller,
- library_id=library_id,
- ldda_ids=ldda_ids,
- use_panels=use_panels,
- show_deleted=show_deleted,
- message=message,
- status=status ) )
lddas = []
error = False
is_admin = trans.user_is_admin() and cntrller == 'library_admin'
@@ -1560,6 +1550,31 @@
error = True
message = 'You must select an action to perform on the selected datasets.'
else:
+ if action == 'import_to_histories':
+ return trans.response.send_redirect( web.url_for( controller='library_common',
+ action='import_datasets_to_histories',
+ cntrller=cntrller,
+ library_id=library_id,
+ ldda_ids=ldda_ids,
+ use_panels=use_panels,
+ show_deleted=show_deleted,
+ message=message,
+ status=status ) )
+ if action == 'move':
+ if library_id in [ 'none', 'None', None ]:
+ source_library_id = ''
+ else:
+ source_library_id = library_id
+ return trans.response.send_redirect( web.url_for( controller='library_common',
+ action='move_library_item',
+ cntrller=cntrller,
+ source_library_id=source_library_id,
+ item_type='ldda',
+ item_id=ldda_ids,
+ use_panels=use_panels,
+ show_deleted=show_deleted,
+ message=message,
+ status=status ) )
ldda_ids = util.listify( ldda_ids )
for ldda_id in ldda_ids:
try:
@@ -1779,9 +1794,11 @@
@web.expose
def import_datasets_to_histories( self, trans, cntrller, library_id='', folder_id='', ldda_ids='', target_history_ids='', new_history_name='', **kwd ):
# This method is called from one of the following places:
- # - a menu option for a library dataset ( ldda_ids will be a singel dataset id )
- # - a menu option for a library folder ( folder_id will have a value )
- # - a menu option for a library dataset search result set ( ldda_ids will be a comma separated string of dataset ids )
+ # - a menu option for a library dataset ( ldda_ids is a single ldda id )
+ # - a menu option for a library folder ( folder_id has a value )
+ # - a select list option for acting on multiple selected datasets within a library
+ # ( ldda_ids is a comma separated string of ldda ids )
+ # - a menu option for a library dataset search result set ( ldda_ids is a comma separated string of ldda ids )
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
@@ -1799,13 +1816,7 @@
folder = None
ldda_ids = util.listify( ldda_ids )
if ldda_ids:
- # Check boxes cause 2 copies of each id to be included in the request
ldda_ids = map( trans.security.decode_id, ldda_ids )
- unique_ldda_ids = []
- for ldda_id in ldda_ids:
- if ldda_id not in unique_ldda_ids:
- unique_ldda_ids.append( ldda_id )
- ldda_ids = unique_ldda_ids
target_history_ids = util.listify( target_history_ids )
if target_history_ids:
target_history_ids = [ trans.security.decode_id( target_history_id ) for target_history_id in target_history_ids if target_history_id ]
@@ -1831,15 +1842,15 @@
status = 'error'
for ldda in map( trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get, ldda_ids ):
if ldda is None:
- message += "You tried to import a library dataset that does not exist. "
+ message += "You tried to import a dataset that does not exist. "
status = 'error'
invalid_datasets += 1
elif ldda.dataset.state not in [ trans.model.Dataset.states.OK, trans.model.Dataset.states.ERROR ]:
- message += "Cannot import dataset '%s' since its state is '%s'. " % ( ldda.name, ldda.dataset.state )
+ message += "You cannot import dataset '%s' since its state is '%s'. " % ( ldda.name, ldda.dataset.state )
status = 'error'
invalid_datasets += 1
elif not ldda.has_data():
- message += "Cannot import empty dataset '%s'. " % ldda.name
+ message += "You cannot import empty dataset '%s'. " % ldda.name
status = 'error'
invalid_datasets += 1
else:
@@ -1931,6 +1942,226 @@
message=util.sanitize_text( message ),
status='done' ) )
@web.expose
+ def move_library_item( self, trans, cntrller, item_type, item_id, source_library_id='', make_target_current=True, **kwd ):
+ # This method is called from one of the following places:
+ # - a menu option for a library dataset ( item_type is 'ldda' and item_id is a single ldda id )
+ # - a menu option for a library folder ( item_type is 'folder' and item_id is a single folder id )
+ # - a select list option for acting on multiple selected datasets within a library ( item_type is
+ # 'ldda' and item_id is a comma separated string of ldda ids )
+ # - a menu option for a library dataset search result set ( item_type is 'ldda' and item_id is a
+ # comma separated string of ldda ids )
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ show_deleted = util.string_as_bool( params.get( 'show_deleted', False ) )
+ use_panels = util.string_as_bool( params.get( 'use_panels', False ) )
+ make_target_current = util.string_as_bool( make_target_current )
+ is_admin = trans.user_is_admin() and cntrller == 'library_admin'
+ user = trans.get_user()
+ current_user_roles = trans.get_current_user_roles()
+ move_ldda_ids = []
+ move_lddas = []
+ move_folder_id = []
+ move_folder = None
+ if source_library_id:
+ source_library = trans.sa_session.query( trans.model.Library ).get( trans.security.decode_id( source_library_id ) )
+ else:
+ # Request sent from the library_dataset_search_results page.
+ source_library = None
+ target_library_id = params.get( 'target_library_id', '' )
+ if target_library_id not in [ '', 'none', None ]:
+ target_library = trans.sa_session.query( trans.model.Library ).get( trans.security.decode_id( target_library_id ) )
+ elif make_target_current:
+ target_library = source_library
+ else:
+ target_library = None
+ target_folder_id = params.get( 'target_folder_id', '' )
+ if target_folder_id not in [ '', 'none', None ]:
+ target_folder = trans.sa_session.query( trans.model.LibraryFolder ).get( trans.security.decode_id( target_folder_id ) )
+ if target_library is None:
+ target_library = target_folder.parent_library
+ else:
+ target_folder = None
+ if item_type == 'ldda':
+            # item_id may be a single encoded ldda id or a comma-separated string of ldda ids.
+ move_ldda_ids = util.listify( item_id )
+ if move_ldda_ids:
+ # Checkboxes cause 2 copies of each id to be included in the request
+ move_ldda_ids = map( trans.security.decode_id, move_ldda_ids )
+ unique_ldda_ids = []
+ for ldda_id in move_ldda_ids:
+ if ldda_id not in unique_ldda_ids:
+ unique_ldda_ids.append( ldda_id )
+ move_ldda_ids = unique_ldda_ids
+ elif item_type == 'folder':
+ move_folder_id = item_id
+ move_folder = trans.sa_session.query( trans.model.LibraryFolder ).get( trans.security.decode_id( move_folder_id ) )
+ if params.get( 'move_library_item_button', False ):
+ if not ( move_ldda_ids or move_folder_id ) or target_folder_id in [ '', 'none', None ]:
+ message = "You must select a source folder or one or more source datasets, and a target folder."
+ status = 'error'
+ else:
+ valid_lddas = []
+ invalid_lddas = []
+ invalid_items = 0
+ flush_required = False
+ if item_type == 'ldda':
+ for ldda in map( trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get, move_ldda_ids ):
+ if ldda is None:
+ message += "You tried to move a dataset that does not exist. "
+ status = 'error'
+ invalid_items += 1
+ elif ldda.dataset.state not in [ trans.model.Dataset.states.OK, trans.model.Dataset.states.ERROR ]:
+ message += "You cannot move dataset '%s' since its state is '%s'. " % ( ldda.name, ldda.dataset.state )
+ status = 'error'
+ invalid_items += 1
+ elif not ldda.has_data():
+ message += "You cannot move empty dataset '%s'. " % ldda.name
+ status = 'error'
+ invalid_items += 1
+ else:
+ if is_admin:
+ library_dataset = ldda.library_dataset
+ library_dataset.folder = target_folder
+ trans.sa_session.add( library_dataset )
+ flush_required = True
+ else:
+ if trans.app.security_agent.can_modify_library_item( current_user_roles, ldda ):
+ valid_lddas.append( ldda )
+ library_dataset = ldda.library_dataset
+ library_dataset.folder = target_folder
+ trans.sa_session.add( library_dataset )
+ flush_required = True
+ else:
+ invalid_items += 1
+ invalid_lddas.append( ldda )
+                        if not is_admin and not valid_lddas:
+                            message = "You are not authorized to move any of the selected datasets."
+                            status = 'error'
+                        else:
+                            if invalid_lddas:
+                                message += "You are not authorized to move %s: " % inflector.cond_plural( len( invalid_lddas ), "dataset" )
+                                for ldda in invalid_lddas:
+                                    message += '(%s)' % ldda.name
+                                message += '. '
+                            num_source = len( move_ldda_ids ) - invalid_items
+                            message += "%i %s moved to folder (%s) within data library (%s)" % ( num_source,
+                                                                                                 inflector.cond_plural( num_source, "dataset" ),
+                                                                                                 target_folder.name,
+                                                                                                 target_library.name )
+ elif item_type == 'folder':
+ move_folder = trans.sa_session.query( trans.app.model.LibraryFolder ) \
+ .get( trans.security.decode_id( move_folder_id ) )
+ if move_folder is None:
+ message += "You tried to move a folder that does not exist. "
+ status = 'error'
+ invalid_items += 1
+ else:
+ move_folder.parent = target_folder
+ trans.sa_session.add( move_folder )
+ flush_required = True
+ message = "Moved folder (%s) to folder (%s) within data library (%s) " % ( move_folder.name,
+ target_folder.name,
+ target_library.name )
+ if flush_required:
+ trans.sa_session.flush()
+ if target_library:
+ if is_admin:
+ target_library_folders = target_library.get_active_folders( target_library.root_folder )
+ else:
+ folders_with_permission_to_add = []
+ for folder in target_library.get_active_folders( target_library.root_folder ):
+ if trans.app.security_agent.can_add_library_item( current_user_roles, folder ):
+ folders_with_permission_to_add.append( folder )
+ target_library_folders = folders_with_permission_to_add
+ else:
+ target_library_folders = []
+ if item_type == 'ldda':
+ for ldda_id in move_ldda_ids:
+ # TODO: It is difficult to filter out undesired folders (e.g. the ldda's current
+ # folder) if we have a list of lddas, but we may want to filter folders that
+ # are easily handled.
+ ldda = trans.sa_session.query( trans.model.LibraryDatasetDatasetAssociation ).get( ldda_id )
+ move_lddas.append( ldda )
+ elif item_type == 'folder':
+ def __is_contained_in( folder1, folder2 ):
+ # Return True if folder1 is contained in folder2
+ if folder1.parent:
+ if folder1.parent == folder2:
+ return True
+ return __is_contained_in( folder1.parent, folder2 )
+ return False
+ filtered_folders = []
+ for folder in target_library_folders:
+ include = True
+ if move_folder:
+ if __is_contained_in( folder, move_folder ):
+                                    # Don't allow moving a folder into one of its sub-folders (would create a cycle in the db)
+ include = False
+ if move_folder.id == folder.id:
+ # Don't allow moving a folder to itself
+ include = False
+ if move_folder.parent and move_folder.parent.id == folder.id:
+                                    # Don't allow moving a folder to its current parent folder
+ include = False
+ if include:
+ filtered_folders.append( folder )
+ target_library_folders = filtered_folders
+ def __build_target_library_id_select_field( trans, selected_value='none' ):
+ # Get all the libraries for which the current user can add items.
+ target_libraries = []
+ if is_admin:
+ for library in trans.sa_session.query( trans.model.Library ) \
+ .filter( trans.model.Library.deleted == False ) \
+ .order_by( trans.model.Library.table.c.name ):
+ if source_library is None or library.id != source_library.id:
+ target_libraries.append( library )
+ else:
+ for library in trans.app.security_agent.get_accessible_libraries( trans, user ):
+ if source_library is None:
+ if trans.app.security_agent.can_add_library_item( current_user_roles, library ):
+ target_libraries.append( library )
+ elif library.id != source_library.id:
+ if trans.app.security_agent.can_add_library_item( current_user_roles, library ):
+ target_libraries.append( library )
+ # A refresh_on_change is required to display the selected library's folders
+ return build_select_field( trans,
+ objs=target_libraries,
+ label_attr='name',
+ select_field_name='target_library_id',
+ selected_value=selected_value,
+ refresh_on_change=True )
+ def __build_target_folder_id_select_field( trans, folders, selected_value='none' ):
+ for folder in folders:
+ if not folder.parent:
+ folder.name = 'Data library root folder'
+ return build_select_field( trans,
+ objs=folders,
+ label_attr='name',
+ select_field_name='target_folder_id',
+ selected_value=selected_value,
+ refresh_on_change=False )
+ if target_library:
+ selected_value = target_library.id
+ else:
+ selected_value = 'none'
+ target_library_id_select_field = __build_target_library_id_select_field( trans, selected_value=selected_value )
+ target_folder_id_select_field = __build_target_folder_id_select_field( trans, target_library_folders )
+ return trans.fill_template( "/library/common/move_library_item.mako",
+ cntrller=cntrller,
+ make_target_current=make_target_current,
+ source_library=source_library,
+ item_type=item_type,
+ item_id=item_id,
+ move_ldda_ids=move_ldda_ids,
+ move_lddas=move_lddas,
+ move_folder=move_folder,
+ target_library=target_library,
+ target_library_id_select_field=target_library_id_select_field,
+ target_folder_id_select_field=target_folder_id_select_field,
+ show_deleted=show_deleted,
+ use_panels=use_panels,
+ message=message,
+ status=status )
+ @web.expose
def delete_library_item( self, trans, cntrller, library_id, item_id, item_type, **kwd ):
# This action will handle deleting all types of library items. State is saved for libraries and
# folders ( i.e., if undeleted, the state of contents of the library or folder will remain, so previously
@@ -2257,7 +2488,7 @@
# Perform search
parser = MultifieldParser( [ 'name', 'info', 'dbkey', 'message' ], schema=schema )
# Search term with wildcards may be slow...
- results = searcher.search( parser.parse( '*' + search_term + '*' ), minscore=0.5 )
+ results = searcher.search( parser.parse( '*' + search_term + '*' ), minscore=0.01 )
ldda_ids = [ result[ 'id' ] for result in results ]
lddas = []
for ldda_id in ldda_ids:
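The cycle guard that this commit adds via `__is_contained_in` can be sketched in isolation. `Folder` below is a hypothetical stand-in for Galaxy's `LibraryFolder` model, reduced to the parent pointer the check actually walks; the logic mirrors the recursive test used to filter invalid move targets.

```python
class Folder:
    """Minimal stand-in for a library folder with a parent pointer."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

def is_contained_in(folder1, folder2):
    # True if folder1 lives anywhere under folder2; walks parent
    # pointers toward the root, one level per recursive call.
    if folder1.parent:
        if folder1.parent is folder2:
            return True
        return is_contained_in(folder1.parent, folder2)
    return False

root = Folder("root")
child = Folder("child", parent=root)
grandchild = Folder("grandchild", parent=child)

# Moving 'child' into its own descendant would create a cycle in the
# parent pointers, so such folders are excluded from the target list.
print(is_contained_in(grandchild, child))  # True -> exclude as a target
print(is_contained_in(root, child))        # False -> root stays a valid target
```

Rejecting descendants (plus the folder itself and its current parent, as the filter above does) is what keeps the folder tree acyclic after a move.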
--- a/templates/library/common/browse_library.mako Tue Mar 08 14:35:26 2011 -0500
+++ b/templates/library/common/browse_library.mako Tue Mar 08 15:50:19 2011 -0500
@@ -247,19 +247,20 @@
checked="checked"
%endif
/>
- %if ldda.library_dataset.deleted:
- <span class="libraryItem-error">
- %endif
<div style="float: left; margin-left: 1px;" class="menubutton split popup" id="dataset-${ldda.id}-popup">
- <a class="view-info" href="${h.url_for( controller='library_common', action='ldda_info', cntrller=cntrller, library_id=trans.security.encode_id( library.id ), folder_id=trans.security.encode_id( folder.id ), id=trans.security.encode_id( ldda.id ), use_panels=use_panels, show_deleted=show_deleted )}">${ldda.name}</a>
+ <a class="view-info" href="${h.url_for( controller='library_common', action='ldda_info', cntrller=cntrller, library_id=trans.security.encode_id( library.id ), folder_id=trans.security.encode_id( folder.id ), id=trans.security.encode_id( ldda.id ), use_panels=use_panels, show_deleted=show_deleted )}">
+ %if ldda.library_dataset.deleted:
+ <div class="libraryItem-error">${ldda.name}</div>
+ %else:
+ ${ldda.name}
+ %endif
+                        </a>
+                    </div>
- %if ldda.library_dataset.deleted:
- </span>
- %endif
%if not library.deleted:
<div popupmenu="dataset-${ldda.id}-popup">
%if not branch_deleted( folder ) and not ldda.library_dataset.deleted and can_modify:
<a class="action-button" href="${h.url_for( controller='library_common', action='ldda_edit_info', cntrller=cntrller, library_id=trans.security.encode_id( library.id ), folder_id=trans.security.encode_id( folder.id ), id=trans.security.encode_id( ldda.id ), use_panels=use_panels, show_deleted=show_deleted )}">Edit information</a>
+ <a class="action-button" href="${h.url_for( controller='library_common', action='move_library_item', cntrller=cntrller, item_type='ldda', item_id=trans.security.encode_id( ldda.id ), source_library_id=trans.security.encode_id( library.id ), use_panels=use_panels, show_deleted=show_deleted )}">Move this dataset</a>
%else:
<a class="action-button" href="${h.url_for( controller='library_common', action='ldda_info', cntrller=cntrller, library_id=trans.security.encode_id( library.id ), folder_id=trans.security.encode_id( folder.id ), id=trans.security.encode_id( ldda.id ), use_panels=use_panels, show_deleted=show_deleted )}">View information</a>
%endif
@@ -348,7 +349,7 @@
info_association, inherited = folder.get_info_association( restrict=True )
%>
%if not root_folder and ( not folder.deleted or show_deleted ):
- <% encoded_id = trans.security.encode_id(folder.id) %>
+                    <% encoded_id = trans.security.encode_id( folder.id ) %>
                    <tr id="folder-${encoded_id}" class="folderRow libraryOrFolderRow"
%if parent is not None:
parent="${parent}"
@@ -358,15 +359,16 @@
                <td style="padding-left: ${folder_pad}px;">
                    <input type="checkbox" class="folderCheckbox"/>
                    <span class="expandLink folder-${encoded_id}-click">
- <div style="float: left; margin-left: 2px;" class="menubutton split popup" id="folder_img-${folder.id}-popup">
- <a class="folder-${encoded_id}-click" href="javascript:void(0);">
- %if folder.deleted:
- <span class="libraryItem-error">${folder.name}</span>
- %else:
- ${folder.name}
- %endif
- </a>
- </div>
+ <div style="float: left; margin-left: 2px;" class="menubutton split popup" id="folder_img-${folder.id}-popup">
+ <a class="folder-${encoded_id}-click" href="javascript:void(0);">
+ %if folder.deleted:
+ <div class="libraryItem-error">${folder.name}</div>
+ %else:
+ ${folder.name}
+ %endif
+ </a>
+ </div>
+ </span>
%if not library.deleted:
<div popupmenu="folder_img-${folder.id}-popup">
%if not branch_deleted( folder ) and can_add:
@@ -379,6 +381,7 @@
%endif
%if can_modify:
<a class="action-button" href="${h.url_for( controller='library_common', action='folder_info', cntrller=cntrller, id=trans.security.encode_id( folder.id ), library_id=trans.security.encode_id( library.id ), use_panels=use_panels, show_deleted=show_deleted )}">Edit information</a>
+ <a class="action-button" href="${h.url_for( controller='library_common', action='move_library_item', cntrller=cntrller, item_type='folder', item_id=trans.security.encode_id( folder.id ), source_library_id=trans.security.encode_id( library.id ), use_panels=use_panels, show_deleted=show_deleted )}">Move this folder</a>
%else:
<a class="action-button" class="view-info" href="${h.url_for( controller='library_common', action='folder_info', cntrller=cntrller, id=trans.security.encode_id( folder.id ), library_id=trans.security.encode_id( library.id ), use_panels=use_panels, show_deleted=show_deleted )}">View information</a>
%endif
--- a/templates/library/common/common.mako Tue Mar 08 14:35:26 2011 -0500
+++ b/templates/library/common/common.mako Tue Mar 08 15:50:19 2011 -0500
@@ -399,17 +399,18 @@
can_download = 'download' not in actions_to_exclude
can_import_to_histories = 'import_to_histories' not in actions_to_exclude
can_manage_permissions = 'manage_permissions' not in actions_to_exclude
+ can_move = 'move' not in actions_to_exclude
    %>
    <tfoot>
        <tr>
            <td colspan="5" style="padding-left: 42px;">
- For selected items:
+ For selected datasets:
<select name="do_action" id="action_on_selected_items">
%if can_import_to_histories:
%if not is_admin and default_action == 'import_to_histories':
- <option value="import_to_histories" selected>Import selected datasets to histories</option>
+ <option value="import_to_histories" selected>Import to histories</option>
%else:
- <option value="import_to_histories">Import selected datasets to histories</option>
+ <option value="import_to_histories">Import to histories</option>
%endif
%endif
%if can_manage_permissions:
@@ -419,13 +420,19 @@
<option value="manage_permissions">Edit permissions</option>
%endif
%endif
+ %if can_move:
+ <option value="move">Move</option>
+ %endif
+ %if can_delete:
+ <option value="delete">Delete</option>
+ %endif
%if can_download:
%if 'gz' in comptypes:
<option value="tgz"
%if default_action == 'download':
selected
%endif>
- >Download as a .tar.gz file</option>
+ >Download as a .tar.gz file</option>
%endif
%if 'bz2' in comptypes:
<option value="tbz">Download as a .tar.bz2 file</option>
@@ -442,9 +449,6 @@
>Download as a .zip file</option>
%endif
%endif
- %if can_delete:
- <option value="delete">Delete</option>
- %endif
                </select>
                <input type="submit" class="primary-button" name="action_on_datasets_button" id="action_on_datasets_button" value="Go"/>
            </td>
--- a/templates/library/common/import_datasets_to_histories.mako Tue Mar 08 14:35:26 2011 -0500
+++ b/templates/library/common/import_datasets_to_histories.mako Tue Mar 08 15:50:19 2011 -0500
@@ -4,74 +4,53 @@
<%def name="title()">Import library datasets to histories</%def>

<%def name="javascripts()">
-
-${parent.javascripts()}
-${h.js( "jquery", "galaxy.base" )}
-<script type="text/javascript">
- $(function() {
- $("#select-multiple").click(function() {
- $("#single-dest-select").val("");
- $("#single-destination").hide();
- $("#multiple-destination").show();
+ ${parent.javascripts()}
+ ${h.js( "jquery", "galaxy.base" )}
+ <script type="text/javascript">
+ $(function() {
+ $("#select-multiple").click(function() {
+ $("#single-dest-select").val("");
+ $("#single-destination").hide();
+ $("#multiple-destination").show();
+ });
});
- });
-</script>
-
+    </script>
</%def>
%if message:
${render_msg( message, status )}
%endif
-<p>
- <div class="infomessage">Import library datasets into histories.</div>
- <div style="clear: both"></div>
-</p>
-<p>
- <form method="post">
- <div class="toolForm" style="float: left; width: 45%; padding: 0px;">
- <div class="toolFormBody">
- <input type="hidden" name="cntrller" value="${cntrller}"/>
- %if source_lddas:
- %for source_ldda in source_lddas:
- <%
- checked = ""
- encoded_id = trans.security.encode_id( source_ldda.id )
- if source_ldda.id in ldda_ids:
- checked = " checked='checked'"
- %>
- <div class="form-row">
- <input type="checkbox" name="ldda_ids" id="dataset_${encoded_id}" value="${encoded_id}" ${checked}/>
- <label for="dataset_${encoded_id}" style="display: inline;font-weight:normal;">${source_ldda.name}</label>
- </div>
- %endfor
- %else:
- <div class="form-row">This folder has no accessible library datasets.</div>
- %endif
- </div>
+<b>Import library datasets into histories</b>
+<br/><br/>
+<form action="${h.url_for( controller='library_common', action='import_datasets_to_histories', cntrller=cntrller, use_panels=use_panels, show_deleted=show_deleted )}" method="post">
+ <div class="toolForm" style="float: left; width: 45%; padding: 0px;">
+ <div class="toolFormBody">
+ %if source_lddas:
+ %for source_ldda in source_lddas:
+ <%
+ checked = ""
+ encoded_id = trans.security.encode_id( source_ldda.id )
+ if source_ldda.id in ldda_ids:
+ checked = " checked='checked'"
+ %>
+ <div class="form-row">
+ <input type="checkbox" name="ldda_ids" id="dataset_${encoded_id}" value="${encoded_id}" ${checked}/>
+ <label for="dataset_${encoded_id}" style="display: inline;font-weight:normal;">${source_ldda.name}</label>
+ </div>
+ %endfor
+ %else:
+ <div class="form-row">This folder has no accessible library datasets.</div>
+ %endif
</div>
- <div style="float: left; padding-left: 10px; font-size: 36px;">→</div>
- <div class="toolForm" style="float: right; width: 45%; padding: 0px;">
- <div class="toolFormTitle">Destination Histories:</div>
- <div class="toolFormBody">
- <div class="form-row" id="single-destination">
- <select id="single-dest-select" name="target_history_ids">
- <option value=""></option>
- %for i, target_history in enumerate( target_histories ):
- <%
- encoded_id = trans.security.encode_id( target_history.id )
- if target_history == current_history:
- current_history_text = " (current history)"
- else:
- current_history_text = ""
- %>
- <option value="${encoded_id}">${i + 1}: ${h.truncate( target_history.name, 30 )}${current_history_text}</option>
- %endfor
- </select>
- <br/><br/>
- <a style="margin-left: 10px;" href="javascript:void(0);" id="select-multiple">Choose multiple histories</a>
- </div>
- <div id="multiple-destination" style="display: none;">
+ </div>
+ <div style="float: left; padding-left: 10px; font-size: 36px;">→</div>
+ <div class="toolForm" style="float: right; width: 45%; padding: 0px;">
+ <div class="toolFormTitle">Destination Histories:</div>
+ <div class="toolFormBody">
+ <div class="form-row" id="single-destination">
+ <select id="single-dest-select" name="target_history_ids">
+ <option value=""></option>
%for i, target_history in enumerate( target_histories ):
<%
encoded_id = trans.security.encode_id( target_history.id )
@@ -80,31 +59,45 @@
else:
current_history_text = ""
%>
- <div class="form-row">
- <input type="checkbox" name="target_history_ids" id="target_history_${encoded_id}" value="${encoded_id}"/>
- <label for="target_history_${encoded_id}" style="display: inline; font-weight:normal;">${i + 1}: ${target_history.name}${current_history_text}</label>
- </div>
+ <option value="${encoded_id}">${i + 1}: ${h.truncate( target_history.name, 30 )}${current_history_text}</option>
%endfor
+ </select>
+ <br/><br/>
+ <a style="margin-left: 10px;" href="javascript:void(0);" id="select-multiple">Choose multiple histories</a>
+ </div>
+ <div id="multiple-destination" style="display: none;">
+ %for i, target_history in enumerate( target_histories ):
+ <%
+ encoded_id = trans.security.encode_id( target_history.id )
+ if target_history == current_history:
+ current_history_text = " (current history)"
+ else:
+ current_history_text = ""
+ %>
+ <div class="form-row">
+ <input type="checkbox" name="target_history_ids" id="target_history_${encoded_id}" value="${encoded_id}"/>
+ <label for="target_history_${encoded_id}" style="display: inline; font-weight:normal;">${i + 1}: ${target_history.name}${current_history_text}</label>
+ </div>
+ %endfor
+ </div>
+ %if trans.get_user():
+ <%
+ checked = ""
+ if "create_new_history" in target_history_ids:
+ checked = " checked='checked'"
+ %>
+ <hr />
+ <div style="text-align: center; color: #888;">— OR —</div>
+ <div class="form-row">
+ <label for="new_history_name" style="display: inline; font-weight:normal;">New history named:</label>
+                            <input type="text" name="new_history_name" />
+                        </div>
- %if trans.get_user():
- <%
- checked = ""
- if "create_new_history" in target_history_ids:
- checked = " checked='checked'"
- %>
- <hr />
- <div style="text-align: center; color: #888;">— OR —</div>
- <div class="form-row">
- <label for="new_history_name" style="display: inline; font-weight:normal;">New history named:</label>
- <input type="textbox" name="new_history_name" />
- </div>
- %endif
- </div>
+ %endif
</div>
- <div style="clear: both"></div>
- <div class="form-row" align="center">
- <input type="submit" class="primary-button" name="import_datasets_to_histories_button" value="Import library datasets"/>
- </div>
- </form></div>
-</p>
+ <div style="clear: both"></div>
+ <div class="form-row" align="center">
+ <input type="submit" class="primary-button" name="import_datasets_to_histories_button" value="Import library datasets"/>
+ </div>
+ </form>
+</div>
--- a/templates/library/common/library_dataset_search_results.mako Tue Mar 08 14:35:26 2011 -0500
+++ b/templates/library/common/library_dataset_search_results.mako Tue Mar 08 15:50:19 2011 -0500
@@ -95,7 +95,7 @@
<p>The string "${search_term}" was found in at least one of the following information components of the displayed library datasets.</p>
${render_searched_components()}
<form name="act_on_multiple_datasets" action="${h.url_for( controller='library_common', action='act_on_multiple_datasets', cntrller=cntrller, use_panels=use_panels, show_deleted=show_deleted )}" onSubmit="javascript:return checkForm();" method="post">
- <input type="hidden" name=search_term value="${search_term}"/>
+    <input type="hidden" name="search_term" value="${search_term}"/>
    <table cellspacing="0" cellpadding="0" border="0" width="100%" class="grid" id="library-grid">
        <thead>
            <tr class="libraryTitle">
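The checkbox de-duplication that this commit relocates into `move_library_item` keeps the first occurrence of each id while preserving submission order. A standalone sketch of that filter (function name is hypothetical):

```python
def unique_in_order(ids):
    # Checkboxes cause 2 copies of each id to be included in the
    # request; keep only the first occurrence, preserving order.
    seen = []
    for item_id in ids:
        if item_id not in seen:
            seen.append(item_id)
    return seen

print(unique_in_order([7, 3, 7, 3, 9]))  # [7, 3, 9]
```

Order preservation matters here because the ids are later mapped back to datasets for user-facing messages; a plain `set()` would lose it.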
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.