galaxy-commits
June 2011
- 1 participant
- 136 discussions
commit/galaxy-central: greg: Fix bug introduced in 5695: 8731db1b2bfb - hda.history must have a user before the check for hda.dataset.has_manage_permissions_roles.
by Bitbucket 14 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/036eace358b7/
changeset: 036eace358b7
user: greg
date: 2011-06-14 15:52:05
summary: Fix bug introduced in 5695: 8731db1b2bfb - hda.history must have a user before the check for hda.dataset.has_manage_permissions_roles.
affected #: 1 file (22 bytes)
--- a/lib/galaxy/web/controllers/root.py Mon Jun 13 16:07:04 2011 -0400
+++ b/lib/galaxy/web/controllers/root.py Tue Jun 14 09:52:05 2011 -0400
@@ -316,7 +316,7 @@
if id is not None and data.history.user is not None and data.history.user != trans.user:
return trans.show_error_message( "This instance of a dataset (%s) in a history does not belong to you." % ( data.id ) )
current_user_roles = trans.get_current_user_roles()
- if not data.dataset.has_manage_permissions_roles( trans ):
+ if data.history.user and not data.dataset.has_manage_permissions_roles( trans ):
# Permission setting related to DATASET_MANAGE_PERMISSIONS was broken for a period of time,
# so it is possible that some Datasets have no roles associated with the DATASET_MANAGE_PERMISSIONS
# permission. In this case, we'll reset this permission to the hda user's private role.
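The patch above tightens a guard: an anonymous history has no user, so the private-role reset must be skipped entirely. The ordering can be illustrated in plain Python (function and parameter names are ours, not Galaxy's API):

```python
def needs_private_role_reset(history_user, has_manage_roles):
    """True only when the history has an owner AND the dataset has no roles
    on its manage-permissions action -- the exact case the patch guards.
    An ownerless (anonymous) history never triggers the reset."""
    return history_user is not None and not has_manage_roles
```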
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Security agent bu fix - eliminate an unwanted line # 481 - no idea where it came from...
by Bitbucket 13 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/cc71d91417b8/
changeset: cc71d91417b8
user: greg
date: 2011-06-13 22:07:04
summary: Security agent bu fix - eliminate an unwanted line # 481 - no idea where it came from...
affected #: 1 file (51 bytes)
--- a/lib/galaxy/security/__init__.py Mon Jun 13 14:05:41 2011 +0200
+++ b/lib/galaxy/security/__init__.py Mon Jun 13 16:07:04 2011 -0400
@@ -479,7 +479,6 @@
Set new permissions on a dataset, eliminating all current permissions
permissions looks like: { Action : [ Role, Role ] }
"""
- for action, roles in permissions.items():
# Make sure that DATASET_MANAGE_PERMISSIONS is associated with at least 1 role
has_dataset_manage_permissions = False
for action, roles in permissions.items():
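The stray line duplicated the loop header, so the first iteration variable was clobbered before use. A simplified sketch of the intended single-pass check (hypothetical standalone version; the real method goes on to rewrite the dataset's permissions after validating):

```python
def validate_manage_permissions(permissions):
    """Return '' when at least one role carries the manage-permissions action,
    otherwise an error message -- mirrors the early-return added in the patch.
    `permissions` maps actions (objects with an .action name, or bare strings)
    to lists of roles."""
    MANAGE = "DATASET_MANAGE_PERMISSIONS"
    for action, roles in permissions.items():
        # Accept Action-like objects or plain strings as keys.
        name = getattr(action, "action", action)
        if name == MANAGE and roles:
            return ""
    return "At least 1 role must be associated with the manage permissions permission on this dataset."
```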
commit/galaxy-central: Peter van Heusden: Added TwoBit datatype for twobit binary nucleotide datatype. Sniffer code
by Bitbucket 13 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/b319d5c44acf/
changeset: b319d5c44acf
user: Peter van Heusden
date: 2011-06-13 14:05:41
summary: Added TwoBit datatype for twobit binary nucleotide datatype. Sniffer code
based on bx-python's bx.seq.twobit.
affected #: 2 files (1.3 KB)
--- a/datatypes_conf.xml.sample Mon Jun 13 10:47:51 2011 -0400
+++ b/datatypes_conf.xml.sample Mon Jun 13 14:05:41 2011 +0200
@@ -116,6 +116,7 @@
<datatype extension="svg" type="galaxy.datatypes.images:Image" mimetype="image/svg+xml"/><datatype extension="taxonomy" type="galaxy.datatypes.tabular:Taxonomy" display_in_upload="true"/><datatype extension="tabular" type="galaxy.datatypes.tabular:Tabular" display_in_upload="true"/>
+ <datatype extension="twobit" type="galaxy.datatypes.binary:TwoBit" mimetype="application/octet-stream" display_in_upload="true"/><datatype extension="txt" type="galaxy.datatypes.data:Text" display_in_upload="true"/><datatype extension="memexml" type="galaxy.datatypes.xml:MEMEXml" mimetype="application/xml" display_in_upload="true"/><datatype extension="blastxml" type="galaxy.datatypes.xml:BlastXml" mimetype="application/xml" display_in_upload="true"/>
@@ -279,6 +280,7 @@
defined format first, followed by next-most rigidly defined,
and so on.
-->
+ <sniffer type="galaxy.datatypes.binary:TwoBit"/><sniffer type="galaxy.datatypes.binary:Bam"/><sniffer type="galaxy.datatypes.binary:Sff"/><sniffer type="galaxy.datatypes.xml:BlastXml"/>
--- a/lib/galaxy/datatypes/binary.py Mon Jun 13 10:47:51 2011 -0400
+++ b/lib/galaxy/datatypes/binary.py Mon Jun 13 14:05:41 2011 +0200
@@ -6,6 +6,10 @@
from galaxy.datatypes.metadata import MetadataElement
from galaxy.datatypes import metadata
from galaxy.datatypes.sniff import *
+from galaxy import eggs
+import pkg_resources
+pkg_resources.require( "bx-python" )
+from bx.seq.twobit import TWOBIT_MAGIC_NUMBER, TWOBIT_MAGIC_NUMBER_SWAP, TWOBIT_MAGIC_SIZE
from urllib import urlencode, quote_plus
import zipfile, gzip
import os, subprocess, tempfile
@@ -292,3 +296,29 @@
def get_track_type( self ):
return "LineTrack", {"data_standalone": "bigbed"}
+class TwoBit (Binary):
+ """Class describing a TwoBit format nucleotide file"""
+
+ file_ext = "twobit"
+
+ def sniff(self, filename):
+ try:
+ input = file(filename)
+ magic = struct.unpack(">L", input.read(TWOBIT_MAGIC_SIZE))[0]
+ if magic == TWOBIT_MAGIC_NUMBER or magic == TWOBIT_MAGIC_NUMBER_SWAP:
+ return True
+ except IOError:
+ return False
+
+ def set_peek(self, dataset, is_multi_byte=False):
+ if not dataset.dataset.purged:
+ dataset.peek = "Binary TwoBit format nucleotide file"
+ dataset.blurb = data.nice_size(dataset.get_size())
+ else:
+ return super(TwoBit, self).set_peek(dataset, is_multi_byte)
+
+ def display_peek(self, dataset):
+ try:
+ return dataset.peek
+ except:
+ return "Binary TwoBit format nucleotide file (%s)" % (data.nice_size(dataset.get_size()))
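For context, sniffing a 2bit file means reading its 4-byte signature and accepting either byte order. A self-contained sketch of that check (magic constants are the UCSC 2bit signature and its byte-swapped form; the function name is ours, not the patch's):

```python
import struct

TWOBIT_MAGIC_NUMBER = 0x1A412743       # 2bit signature, big-endian
TWOBIT_MAGIC_NUMBER_SWAP = 0x4327411A  # same bytes, opposite endianness

def is_twobit(filename):
    """Return True if the file starts with the 2bit magic number
    in either byte order, False otherwise (including short files)."""
    try:
        with open(filename, "rb") as handle:
            magic = struct.unpack(">L", handle.read(4))[0]
    except (IOError, OSError, struct.error):
        return False
    return magic in (TWOBIT_MAGIC_NUMBER, TWOBIT_MAGIC_NUMBER_SWAP)
```

Note the explicit `return False` on the non-matching path; the version in the diff above falls through and returns `None` when the magic does not match.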
commit/galaxy-central: greg: No longer allow Datasets to have DatasetPermissions set such that no roles are associated with the DATASET_MANAGE_PERMISSION.
by Bitbucket 13 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/8731db1b2bfb/
changeset: 8731db1b2bfb
user: greg
date: 2011-06-13 16:47:51
summary: No longer allow Datasets to have DatasetPermissions set such that no roles are associated with the DATASET_MANAGE_PERMISSION.
Include the automatic creation of a new DatasetPermission for an hda where the DATASET_MANAGE_PERMISSION permission is associated with the hda.history.user's private role if the hda has no roles associated with the DATASET_MANAGE_PERMISSION permission. The creation of the DatasetPermission occurs when the user clicks the pencil icon for the hda.
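The repair described above can be sketched abstractly (dict-based stand-ins, not Galaxy's model classes or API): if no permission entry carries the manage action, one is created pointing at the owner's private role, and an existing entry is left alone.

```python
def ensure_manage_permission(dataset_actions, owner_private_role):
    """dataset_actions: list of {'action': ..., 'role': ...} dicts standing in
    for DatasetPermissions rows. Appends a manage-permissions entry bound to
    the owner's private role only when none exists; idempotent otherwise."""
    MANAGE = "DATASET_MANAGE_PERMISSIONS"
    if not any(entry["action"] == MANAGE for entry in dataset_actions):
        dataset_actions.append({"action": MANAGE, "role": owner_private_role})
    return dataset_actions
```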
affected #: 10 files (5.2 KB)
--- a/lib/galaxy/jobs/deferred/data_transfer.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/jobs/deferred/data_transfer.py Mon Jun 13 10:47:51 2011 -0400
@@ -108,7 +108,7 @@
ld = self.app.model.LibraryDataset( folder=sample.folder, name=library_dataset_name )
self.sa_session.add( ld )
self.sa_session.flush()
- self.app.security_agent.copy_library_permissions( sample.folder, ld )
+ self.app.security_agent.copy_library_permissions( FakeTrans( self.app ), sample.folder, ld )
ldda = self.app.model.LibraryDatasetDatasetAssociation( name = library_dataset_name,
extension = extension,
dbkey = '?',
--- a/lib/galaxy/model/__init__.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/model/__init__.py Mon Jun 13 10:47:51 2011 -0400
@@ -594,6 +594,17 @@
if dp.action == trans.app.security_agent.permitted_actions.DATASET_ACCESS.action:
roles.append( dp.role )
return roles
+ def get_manage_permissions_roles( self, trans ):
+ roles = []
+ for dp in self.actions:
+ if dp.action == trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS.action:
+ roles.append( dp.role )
+ return roles
+ def has_manage_permissions_roles( self, trans ):
+ for dp in self.actions:
+ if dp.action == trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS.action:
+ return True
+ return False
class DatasetInstance( object ):
"""A base class for all 'dataset instances', HDAs, LDAs, etc"""
@@ -1265,6 +1276,10 @@
return
def get_access_roles( self, trans ):
return self.dataset.get_access_roles( trans )
+ def get_manage_permissions_roles( self, trans ):
+ return self.dataset.get_manage_permissions_roles( trans )
+ def has_manage_permissions_roles( self, trans ):
+ return self.dataset.has_manage_permissions_roles( trans )
def get_info_association( self, restrict=False, inherited=False ):
# If restrict is True, we will return this ldda's info_association whether it
# exists or not ( in which case None will be returned ). If restrict is False,
--- a/lib/galaxy/security/__init__.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/security/__init__.py Mon Jun 13 10:47:51 2011 -0400
@@ -70,7 +70,7 @@
raise "Unimplemented Method"
def set_dataset_permission( self, dataset, permission ):
raise "Unimplemented Method"
- def set_all_library_permissions( self, dataset, permissions ):
+ def set_all_library_permissions( self, trans, dataset, permissions ):
raise "Unimplemented Method"
def library_is_public( self, library ):
raise "Unimplemented Method"
@@ -479,6 +479,19 @@
Set new permissions on a dataset, eliminating all current permissions
permissions looks like: { Action : [ Role, Role ] }
"""
+ for action, roles in permissions.items():
+ # Make sure that DATASET_MANAGE_PERMISSIONS is associated with at least 1 role
+ has_dataset_manage_permissions = False
+ for action, roles in permissions.items():
+ if isinstance( action, Action ):
+ if action == self.permitted_actions.DATASET_MANAGE_PERMISSIONS and roles:
+ has_dataset_manage_permissions = True
+ break
+ elif action == self.permitted_actions.DATASET_MANAGE_PERMISSIONS.action and roles:
+ has_dataset_manage_permissions = True
+ break
+ if not has_dataset_manage_permissions:
+ return "At least 1 role must be associated with the <b>manage permissions</b> permission on this dataset."
flush_needed = False
# Delete all of the current permissions on the dataset
for dp in dataset.actions:
@@ -493,6 +506,7 @@
flush_needed = True
if flush_needed:
self.sa_session.flush()
+ return ""
def set_dataset_permission( self, dataset, permission={} ):
"""
Set a specific permission on a dataset, leaving all other current permissions on the dataset alone
@@ -580,7 +594,7 @@
for user in users:
self.associate_components( user=user, role=sharing_role )
self.set_dataset_permission( dataset, { self.permitted_actions.DATASET_ACCESS : [ sharing_role ] } )
- def set_all_library_permissions( self, library_item, permissions={} ):
+ def set_all_library_permissions( self, trans, library_item, permissions={} ):
# Set new permissions on library_item, eliminating all current permissions
flush_needed = False
for role_assoc in library_item.actions:
@@ -595,14 +609,21 @@
for role_assoc in [ permission_class( action, library_item, role ) for role in roles ]:
self.sa_session.add( role_assoc )
flush_needed = True
- if isinstance( library_item, self.model.LibraryDatasetDatasetAssociation ) and \
- action == self.permitted_actions.LIBRARY_MANAGE.action:
- # Handle the special case when we are setting the LIBRARY_MANAGE_PERMISSION on a
- # library_dataset_dataset_association since the roles need to be applied to the
- # DATASET_MANAGE_PERMISSIONS permission on the associated dataset
- permissions = {}
- permissions[ self.permitted_actions.DATASET_MANAGE_PERMISSIONS ] = roles
- self.set_dataset_permission( library_item.dataset, permissions )
+ if isinstance( library_item, self.model.LibraryDatasetDatasetAssociation ):
+ # Permission setting related to DATASET_MANAGE_PERMISSIONS was broken for a period of time,
+ # so it is possible that some Datasets have no roles associated with the DATASET_MANAGE_PERMISSIONS
+ # permission. In this case, we'll reset this permission to the library_item user's private role.
+ if not library_item.dataset.has_manage_permissions_roles( trans ):
+ permission = {}
+ permissions[ self.permitted_actions.DATASET_MANAGE_PERMISSIONS ] = [ trans.app.security_agent.get_private_user_role( library_item.user ) ]
+ self.set_dataset_permission( library_item.dataset, permissions )
+ if action == self.permitted_actions.LIBRARY_MANAGE.action and roles:
+ # Handle the special case when we are setting the LIBRARY_MANAGE_PERMISSION on a
+ # library_dataset_dataset_association since the roles need to be applied to the
+ # DATASET_MANAGE_PERMISSIONS permission on the associated dataset.
+ permissions = {}
+ permissions[ self.permitted_actions.DATASET_MANAGE_PERMISSIONS ] = roles
+ self.set_dataset_permission( library_item.dataset, permissions )
if flush_needed:
self.sa_session.flush()
def library_is_public( self, library, contents=False ):
@@ -752,7 +773,7 @@
else:
permissions[ self.get_action( v.action ) ] = in_roles
return permissions, in_roles, error, msg
- def copy_library_permissions( self, source_library_item, target_library_item, user=None ):
+ def copy_library_permissions( self, trans, source_library_item, target_library_item, user=None ):
# Copy all relevant permissions from source.
permissions = {}
for role_assoc in source_library_item.actions:
@@ -762,7 +783,7 @@
permissions[role_assoc.action].append( role_assoc.role )
else:
permissions[role_assoc.action] = [ role_assoc.role ]
- self.set_all_library_permissions( target_library_item, permissions )
+ self.set_all_library_permissions( trans, target_library_item, permissions )
if user:
item_class = None
for item_class, permission_class in self.library_item_assocs:
--- a/lib/galaxy/tools/actions/upload_common.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/tools/actions/upload_common.py Mon Jun 13 10:47:51 2011 -0400
@@ -143,7 +143,7 @@
folder.add_folder( new_folder )
trans.sa_session.add( new_folder )
trans.sa_session.flush()
- trans.app.security_agent.copy_library_permissions( folder, new_folder )
+ trans.app.security_agent.copy_library_permissions( trans, folder, new_folder )
folder = new_folder
if library_bunch.replace_dataset:
ld = library_bunch.replace_dataset
@@ -151,7 +151,7 @@
ld = trans.app.model.LibraryDataset( folder=folder, name=uploaded_dataset.name )
trans.sa_session.add( ld )
trans.sa_session.flush()
- trans.app.security_agent.copy_library_permissions( folder, ld )
+ trans.app.security_agent.copy_library_permissions( trans, folder, ld )
ldda = trans.app.model.LibraryDatasetDatasetAssociation( name = uploaded_dataset.name,
extension = uploaded_dataset.file_type,
dbkey = uploaded_dataset.dbkey,
@@ -167,7 +167,7 @@
ldda.message = library_bunch.message
trans.sa_session.flush()
# Permissions must be the same on the LibraryDatasetDatasetAssociation and the associated LibraryDataset
- trans.app.security_agent.copy_library_permissions( ld, ldda )
+ trans.app.security_agent.copy_library_permissions( trans, ld, ldda )
if library_bunch.replace_dataset:
# Copy the Dataset level permissions from replace_dataset to the new LibraryDatasetDatasetAssociation.dataset
trans.app.security_agent.copy_dataset_permissions( library_bunch.replace_dataset.library_dataset_dataset_association.dataset, ldda.dataset )
--- a/lib/galaxy/web/controllers/library_common.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/web/controllers/library_common.py Mon Jun 13 10:47:51 2011 -0400
@@ -232,10 +232,10 @@
for k, v in trans.app.model.Library.permitted_actions.items():
in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
- trans.app.security_agent.set_all_library_permissions( library, permissions )
+ trans.app.security_agent.set_all_library_permissions( trans, library, permissions )
trans.sa_session.refresh( library )
# Copy the permissions to the root folder
- trans.app.security_agent.copy_library_permissions( library, library.root_folder )
+ trans.app.security_agent.copy_library_permissions( trans, library, library.root_folder )
message = "Permissions updated for library '%s'." % library.name
return trans.response.send_redirect( web.url_for( controller='library_common',
action='library_permissions',
@@ -285,7 +285,7 @@
trans.sa_session.add( new_folder )
trans.sa_session.flush()
# New folders default to having the same permissions as their parent folder
- trans.app.security_agent.copy_library_permissions( parent_folder, new_folder )
+ trans.app.security_agent.copy_library_permissions( trans, parent_folder, new_folder )
# If we're creating in the API, we're done
if cntrller == 'api':
return 200, dict( created=new_folder )
@@ -411,7 +411,7 @@
# and it is not inherited.
in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( int( x ) ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
- trans.app.security_agent.set_all_library_permissions( folder, permissions )
+ trans.app.security_agent.set_all_library_permissions( trans, folder, permissions )
trans.sa_session.refresh( folder )
message = "Permissions updated for folder '%s'." % folder.name
return trans.response.send_redirect( web.url_for( controller='library_common',
@@ -625,33 +625,50 @@
else:
roles = trans.app.security_agent.get_legitimate_roles( trans, ldda.dataset, cntrller )
if params.get( 'update_roles_button', False ):
- a = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_ACCESS.action )
+ # Dataset permissions
+ access_action = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_ACCESS.action )
+ manage_permissions_action = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS.action )
permissions, in_roles, error, message = \
trans.app.security_agent.derive_roles_from_access( trans, trans.app.security.decode_id( library_id ), cntrller, library=True, **kwd )
+ # Keep roles for DATASET_MANAGE_PERMISSIONS on the dataset
+ if not ldda.has_manage_permissions_roles( trans ):
+ # Permission setting related to DATASET_MANAGE_PERMISSIONS was broken for a period of time,
+ # so it is possible that some Datasets have no roles associated with the DATASET_MANAGE_PERMISSIONS
+ # permission. In this case, we'll reset this permission to the ldda user's private role.
+ #dataset_manage_permissions_roles = [ trans.app.security_agent.get_private_user_role( ldda.user ) ]
+ permissions[ manage_permissions_action ] = [ trans.app.security_agent.get_private_user_role( ldda.user ) ]
+ else:
+ permissions[ manage_permissions_action ] = ldda.get_manage_permissions_roles( trans )
for ldda in lddas:
# Set the DATASET permissions on the Dataset.
if error:
# Keep the original role associations for the DATASET_ACCESS permission on the ldda.
- permissions[ a ] = ldda.get_access_roles( trans )
- trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
- trans.sa_session.refresh( ldda.dataset )
- # Set the LIBRARY permissions on the LibraryDataset. The LibraryDataset and
- # LibraryDatasetDatasetAssociation will be set with the same permissions.
- permissions = {}
- for k, v in trans.app.model.Library.permitted_actions.items():
- if k != 'LIBRARY_ACCESS':
- # LIBRARY_ACCESS is a special permission set only at the library level and it is not inherited.
- in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
- permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
- for ldda in lddas:
- trans.app.security_agent.set_all_library_permissions( ldda.library_dataset, permissions )
- trans.sa_session.refresh( ldda.library_dataset )
- # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
- trans.app.security_agent.set_all_library_permissions( ldda, permissions )
- trans.sa_session.refresh( ldda )
- if error:
- status = 'error'
- else:
+ permissions[ access_action ] = ldda.get_access_roles( trans )
+ status = 'error'
+ else:
+ error = trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
+ if error:
+ message += error
+ status = 'error'
+ trans.sa_session.refresh( ldda.dataset )
+ if not error:
+ # Set the LIBRARY permissions on the LibraryDataset. The LibraryDataset and
+ # LibraryDatasetDatasetAssociation will be set with the same permissions.
+ permissions = {}
+ for k, v in trans.app.model.Library.permitted_actions.items():
+ if k != 'LIBRARY_ACCESS':
+ # LIBRARY_ACCESS is a special permission set only at the library level and it is not inherited.
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
+ for ldda in lddas:
+ error = trans.app.security_agent.set_all_library_permissions( trans, ldda.library_dataset, permissions )
+ trans.sa_session.refresh( ldda.library_dataset )
+ if error:
+ message = error
+ else:
+ # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
+ trans.app.security_agent.set_all_library_permissions( trans, ldda, permissions )
+ trans.sa_session.refresh( ldda )
if len( lddas ) == 1:
message = "Permissions updated for dataset '%s'." % ldda.name
else:
@@ -1211,8 +1228,8 @@
if not replace_dataset:
# If replace_dataset is None, the Library level permissions will be taken from the folder and applied to the new
# LDDA and LibraryDataset.
- trans.app.security_agent.copy_library_permissions( folder, ldda )
- trans.app.security_agent.copy_library_permissions( folder, ldda.library_dataset )
+ trans.app.security_agent.copy_library_permissions( trans, folder, ldda )
+ trans.app.security_agent.copy_library_permissions( trans, folder, ldda.library_dataset )
# Make sure to apply any defined dataset permissions, allowing the permissions inherited from the folder to
# over-ride the same permissions on the dataset, if they exist.
dataset_permissions_dict = trans.app.security_agent.get_permissions( hda.dataset )
@@ -1235,7 +1252,7 @@
if flush_needed:
trans.sa_session.flush()
# Permissions must be the same on the LibraryDatasetDatasetAssociation and the associated LibraryDataset
- trans.app.security_agent.copy_library_permissions( ldda.library_dataset, ldda )
+ trans.app.security_agent.copy_library_permissions( trans, ldda.library_dataset, ldda )
if created_ldda_ids:
created_ldda_ids = created_ldda_ids.lstrip( ',' )
ldda_id_list = created_ldda_ids.split( ',' )
@@ -1478,13 +1495,17 @@
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
# Set the LIBRARY permissions on the LibraryDataset
# NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
- trans.app.security_agent.set_all_library_permissions( library_dataset, permissions )
+ error = trans.app.security_agent.set_all_library_permissions( trans, library_dataset, permissions )
trans.sa_session.refresh( library_dataset )
- # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
- trans.app.security_agent.set_all_library_permissions( library_dataset.library_dataset_dataset_association, permissions )
- trans.sa_session.refresh( library_dataset.library_dataset_dataset_association )
- message = "Permisisons updated for library dataset '%s'." % library_dataset.name
- status = 'done'
+ if error:
+ message = error
+ status = 'error'
+ else:
+ # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
+ trans.app.security_agent.set_all_library_permissions( trans, library_dataset.library_dataset_dataset_association, permissions )
+ trans.sa_session.refresh( library_dataset.library_dataset_dataset_association )
+ message = "Permisisons updated for library dataset '%s'." % library_dataset.name
+ status = 'done'
roles = trans.app.security_agent.get_legitimate_roles( trans, library_dataset, cntrller )
return trans.fill_template( '/library/common/library_dataset_permissions.mako',
cntrller=cntrller,
--- a/lib/galaxy/web/controllers/requests_common.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/web/controllers/requests_common.py Mon Jun 13 10:47:51 2011 -0400
@@ -1050,7 +1050,7 @@
def __import_samples( self, trans, cntrller, request, displayable_sample_widgets, libraries, workflows, **kwd ):
"""
Reads the samples csv file and imports all the samples. The format of the csv file is:
- SampleName,DataLibraryName,DataLibraryFolderName,HistoryName,WorkflowName,FieldValue1,FieldValue2...
+ SampleName,DataLibraryName,DataLibraryFolderName,HistoryName,WorkflowName,Field1Name,Field2Name...
"""
params = util.Params( kwd )
current_user_roles = trans.get_current_user_roles()
--- a/lib/galaxy/web/controllers/root.py Mon Jun 13 09:51:56 2011 -0400
+++ b/lib/galaxy/web/controllers/root.py Mon Jun 13 10:47:51 2011 -0400
@@ -316,6 +316,13 @@
if id is not None and data.history.user is not None and data.history.user != trans.user:
return trans.show_error_message( "This instance of a dataset (%s) in a history does not belong to you." % ( data.id ) )
current_user_roles = trans.get_current_user_roles()
+ if not data.dataset.has_manage_permissions_roles( trans ):
+ # Permission setting related to DATASET_MANAGE_PERMISSIONS was broken for a period of time,
+ # so it is possible that some Datasets have no roles associated with the DATASET_MANAGE_PERMISSIONS
+ # permission. In this case, we'll reset this permission to the hda user's private role.
+ manage_permissions_action = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS.action )
+ permissions = { manage_permissions_action : [ trans.app.security_agent.get_private_user_role( data.history.user ) ] }
+ trans.app.security_agent.set_dataset_permission( data.dataset, permissions )
if trans.app.security_agent.can_access_dataset( current_user_roles, data.dataset ):
if data.state == trans.model.Dataset.states.UPLOAD:
return trans.show_error_message( "Please wait until this dataset finishes uploading before attempting to edit its metadata." )
@@ -394,18 +401,24 @@
if not trans.user:
return trans.show_error_message( "You must be logged in if you want to change permissions." )
if trans.app.security_agent.can_manage_dataset( current_user_roles, data.dataset ):
+ access_action = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_ACCESS.action )
+ manage_permissions_action = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS.action )
# The user associated the DATASET_ACCESS permission on the dataset with 1 or more roles. We
# need to ensure that they did not associate roles that would cause accessibility problems.
permissions, in_roles, error, message = \
trans.app.security_agent.derive_roles_from_access( trans, data.dataset.id, 'root', **kwd )
- a = trans.app.security_agent.get_action( trans.app.security_agent.permitted_actions.DATASET_ACCESS.action )
if error:
# Keep the original role associations for the DATASET_ACCESS permission on the dataset.
- permissions[ a ] = data.dataset.get_access_roles( trans )
- trans.app.security_agent.set_all_dataset_permissions( data.dataset, permissions )
+ permissions[ access_action ] = data.dataset.get_access_roles( trans )
+ status = 'error'
+ else:
+ error = trans.app.security_agent.set_all_dataset_permissions( data.dataset, permissions )
+ if error:
+ message += error
+ status = 'error'
+ else:
+ message = 'Your changes completed successfully.'
trans.sa_session.refresh( data.dataset )
- if not message:
- message = 'Your changes completed successfully.'
else:
return trans.show_error_message( "You are not authorized to change this dataset's permissions" )
if "dbkey" in data.datatype.metadata_spec and not data.metadata.dbkey:
--- a/templates/requests/common/add_samples.mako Mon Jun 13 09:51:56 2011 -0400
+++ b/templates/requests/common/add_samples.mako Mon Jun 13 10:47:51 2011 -0400
@@ -118,7 +118,7 @@
<input type="submit" name="import_samples_button" value="Import samples"/><div class="toolParamHelp" style="clear: both;">
The csv file must be in the following format:<br/>
- SampleName,DataLibraryName,DataLibraryFolderName,HistoryName,WorkflowName,FieldValue1,FieldValue2...
+ SampleName,DataLibraryName,DataLibraryFolderName,HistoryName,WorkflowName,Field1Name,Field2Name...
</div></div></form>
--- a/test/functional/test_library_security.py Mon Jun 13 09:51:56 2011 -0400
+++ b/test/functional/test_library_security.py Mon Jun 13 10:47:51 2011 -0400
@@ -420,11 +420,10 @@
str( role1.id ),
permissions_in=[ 'DATASET_ACCESS' ],
strings_displayed=[ 'Permissions updated for 3 datasets.' ] )
+ # Even though we've eliminated the roles associated with the LIBRARY_MANAGE_PERMISSIONS permission,
+ # none of the roles associated with the DATASET_MANAGE permission sould have been changed.
check_edit_page( latest_3_lddas,
- strings_displayed=[ 'View Permissions' ],
- strings_not_displayed=[ 'Manage dataset permissions on',
- 'can manage roles associated with permissions on this library item',
- 'can import this dataset into their history for analysis' ] )
+ strings_displayed=[ 'manage permissions' ] )
def test_060_restrict_access_to_library2( self ):
"""Testing restricting access to library2"""
# Logged in as admin_user
--- a/test/functional/test_sample_tracking.py Mon Jun 13 09:51:56 2011 -0400
+++ b/test/functional/test_sample_tracking.py Mon Jun 13 10:47:51 2011 -0400
@@ -869,7 +869,6 @@
sample1_dataset.transfer_status.NOT_STARTED ]
self.view_sample_dataset( sample_dataset_id=self.security.encode_id( sample1_dataset.id ),
strings_displayed=strings_displayed )
- '''
def test_999_reset_data_for_later_test_runs( self ):
"""Reseting data to enable later test runs to pass"""
# Logged in as admin_user
@@ -935,4 +934,3 @@
# Manually delete the group from the database
refresh( group )
delete_obj( group )
- '''
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
13 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/1c1ff3753d5e/
changeset: 1c1ff3753d5e
user: natefoo
date: 2011-06-13 15:51:56
summary: Fix missing log setup in migrate script 74
affected #: 1 file (52 bytes)
--- a/lib/galaxy/model/migrate/versions/0074_add_purged_column_to_library_dataset_table.py Fri Jun 10 20:10:09 2011 -0400
+++ b/lib/galaxy/model/migrate/versions/0074_add_purged_column_to_library_dataset_table.py Mon Jun 13 09:51:56 2011 -0400
@@ -7,6 +7,9 @@
from migrate import *
from migrate.changeset import *
+import logging
+log = logging.getLogger( __name__ )
+
metadata = MetaData( migrate_engine )
db_session = scoped_session( sessionmaker( bind=migrate_engine, autoflush=False, autocommit=True ) )
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/8bcc0877b39b/
changeset: 8bcc0877b39b
user: kanwei
date: 2011-06-11 02:10:09
summary: Add missing test files
affected #: 3 files (0 bytes)
commit/galaxy-central: kellyv: Added builds and length info to manual builds list
by Bitbucket 10 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/be2777f9688b/
changeset: be2777f9688b
user: kellyv
date: 2011-06-10 22:31:52
summary: Added builds and length info to manual builds list
affected #: 1 file (3.5 KB)
--- a/tool-data/shared/ucsc/manual_builds.txt Fri Jun 10 14:42:41 2011 -0400
+++ b/tool-data/shared/ucsc/manual_builds.txt Fri Jun 10 16:31:52 2011 -0400
@@ -699,5 +699,6 @@
Araly1 Arabidopsis lyrata
Zea_mays_B73_RefGen_v2 Maize (Zea mays) chr1=301354135,chr2=237068928,chr3=232140222,chr4=241473566,chr5=217872898,chr6=169174371,chr7=176764813,chr8=175793772,chr9=156750718,chr10=150189513,chr11=7140224
Homo_sapiens_AK1 Korean Man chrM=16571,chr1=247249719,chr2=242951149,chr3=199501827,chr4=191273063,chr5=180857866,chr6=170899992,chr7=158821424,chr8=146274826,chr9=140273252,chr10=135374737,chr11=134452384,chr12=132349534,chr13=114142980,chr14=106368585,chr15=100338915,chr16=88827254,chr17=78774742,chr18=76117153,chr19=63811651,chr20=62435964,chr21=46944323,chr22=49691432,chrX=154913754,chrY=57772954
-Tcas_3.0 Red Flour Beetle (Tribolium castaneum)
-hg_g1k_v37 Homo sapiens b37
+Tcas_3.0 Red Flour Beetle (Tribolium castaneum) chrLG1=X=10877635,chrLG2=20218415,chrLG3=38791480,chrLG4=13894384,chrLG5=19135781,chrLG6=13176827,chrLG7=20532854,chrLG8=18021898,chrLG9=21459655,chrLG10=11386040
+hg_g1k_v37 Homo sapiens b37 1=249250621,2=243199373,3=198022430,4=191154276,5=180915260,6=171115067,7=159138663,8=146364022,9=141213431,10=135534747,11=135006516,12=133851895,13=115169878,14=107349540,15=102531392,16=90354753,17=81195210,18=78077248,19=59128983,20=63025520,21=48129895,22=51304566,X=155270560,Y=59373566,MT=16569,GL000207.1=4262,GL000226.1=15008,GL000229.1=19913,GL000231.1=27386,GL000210.1=27682,GL000239.1=33824,GL000235.1=34474,GL000201.1=36148,GL000247.1=36422,GL000245.1=36651,GL000197.1=37175,GL000203.1=37498,GL000246.1=38154,GL000249.1=38502,GL000196.1=38914,GL000248.1=39786,GL000244.1=39929,GL000238.1=39939,GL000202.1=40103,GL000234.1=40531,GL000232.1=40652,GL000206.1=41001,GL000240.1=41933,GL000236.1=41934,GL000241.1=42152,GL000243.1=43341,GL000242.1=43523,GL000230.1=43691,GL000237.1=45867,GL000233.1=45941,GL000204.1=81310,GL000198.1=90085,GL000208.1=92689,GL000191.1=106433,GL000227.1=128374,GL000228.1=129120,GL000214.1=137718,GL000221.1=155397,GL000209.1=159169,GL000218.1=161147,GL000220.1=161802,GL000213.1=164239,GL000211.1=166566,GL000199.1=169874,GL000217.1=172149,GL000216.1=172294,GL000215.1=172545,GL000205.1=174588,GL000219.1=179198,GL000224.1=179693,GL000223.1=180455,GL000195.1=182896,GL000212.1=186858,GL000222.1=186861,GL000200.1=187035,GL000193.1=189789,GL000194.1=191469,GL000225.1=211173,GL000192.1=547496
+Homo_sapiens_nuHg19_mtrCRS	Homo sapiens (hg19 with mtDNA replaced with rCRS)	chr1=249250621,chr2=243199373,chr3=198022430,chr4=191154276,chr5=180915260,chr6=171115067,chr7=159138663,chr8=146364022,chr9=141213431,chr10=135534747,chr11=135006516,chr12=133851895,chr13=115169878,chr14=107349540,chr15=102531392,chr16=90354753,chr17=81195210,chr18=78077248,chr19=59128983,chr20=63025520,chr21=48129895,chr22=51304566,chrX=155270560,chrY=59373566,chrM=16569,chr1_gl000191_random=106433,chr1_gl000192_random=547496,chr4_ctg9_hap1=590426,chr4_gl000193_random=189789,chr4_gl000194_random=191469,chr6_apd_hap1=4622290,chr6_cox_hap2=4795371,chr6_dbb_hap3=4610396,chr6_mann_hap4=4683263,chr6_mcf_hap5=4833398,chr6_qbl_hap6=4611984,chr6_ssto_hap7=4928567,chr7_gl000195_random=182896,chr8_gl000196_random=38914,chr8_gl000197_random=37175,chr9_gl000198_random=90085,chr9_gl000199_random=169874,chr9_gl000200_random=187035,chr9_gl000201_random=36148,chr11_gl000202_random=40103,chr17_ctg5_hap1=1680828,chr17_gl000203_random=37498,chr17_gl000204_random=81310,chr17_gl000205_random=174588,chr17_gl000206_random=41001,chr18_gl000207_random=4262,chr19_gl000208_random=92689,chr19_gl000209_random=159169,chr21_gl000210_random=27682,chrUn_gl000211=166566,chrUn_gl000212=186858,chrUn_gl000213=164239,chrUn_gl000214=137718,chrUn_gl000215=172545,chrUn_gl000216=172294,chrUn_gl000217=172149,chrUn_gl000218=161147,chrUn_gl000219=179198,chrUn_gl000220=161802,chrUn_gl000221=155397,chrUn_gl000222=186861,chrUn_gl000223=180455,chrUn_gl000224=179693,chrUn_gl000225=211173,chrUn_gl000226=15008,chrUn_gl000227=128374,chrUn_gl000228=129120,chrUn_gl000229=19913,chrUn_gl000230=43691,chrUn_gl000231=27386,chrUn_gl000232=40652,chrUn_gl000233=45941,chrUn_gl000234=40531,chrUn_gl000235=34474,chrUn_gl000236=41934,chrUn_gl000237=45867,chrUn_gl000238=39939,chrUn_gl000239=33824,chrUn_gl000240=41933,chrUn_gl000241=42152,chrUn_gl000242=43523,chrUn_gl000243=43341,chrUn_gl000244=39929,chrUn_gl000245=36651,chrUn_gl000246=38154,chrUn_gl000247=36422,chrUn_gl000248=39786,chrUn_gl000249=38502
commit/galaxy-central: anton: Fix to sam2interval bug reported by Kathy So. The bug was causing sam2interval to successfully parse unmapped sam entries, which have no valid coordinate.
by Bitbucket 10 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/076fe3217342/
changeset: 076fe3217342
user: anton
date: 2011-06-10 20:42:41
summary: Fix to sam2interval bug reported by Kathy So. The bug was causing sam2interval to successfully parse unmapped sam entries, which have no valid coordinate.
affected #: 4 files (334 bytes)
Diff too large to display.
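The patch itself is elided above ("Diff too large to display."), but the condition involved is standard SAM semantics: an unmapped record has bit 0x4 set in its FLAG field and carries no usable RNAME/POS, so it must not be emitted as an interval. A hypothetical filter sketching that check (my own illustration, not the actual fix):

```python
def is_mapped(sam_line):
    """Return True only for SAM alignment lines that have a valid coordinate."""
    if sam_line.startswith('@'):
        return False  # header line, not an alignment record
    # SAM mandatory fields: QNAME FLAG RNAME POS MAPQ CIGAR ...
    fields = sam_line.rstrip('\n').split('\t')
    if len(fields) < 4:
        return False  # malformed record
    flag = int(fields[1])
    # FLAG bit 0x4 marks the segment as unmapped; unmapped records also
    # typically report RNAME '*'. Neither can yield an interval.
    return not (flag & 0x4) and fields[2] != '*'
```

A converter like sam2interval would skip any line for which this returns False instead of trying to parse its coordinate.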
commit/galaxy-central: jgoecks: Add bookmarking to Trackster. Also group Trackster options into a single menu button.
by Bitbucket 10 Jun '11
1 new changeset in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/7412c7af8859/
changeset: 7412c7af8859
user: jgoecks
date: 2011-06-10 19:38:24
summary: Add bookmarking to Trackster. Also group Trackster options into a single menu button.
affected #: 7 files (7.0 KB)
--- a/lib/galaxy/web/base/controller.py Fri Jun 10 13:15:17 2011 -0400
+++ b/lib/galaxy/web/base/controller.py Fri Jun 10 13:38:24 2011 -0400
@@ -160,6 +160,7 @@
if visualization.type == 'trackster':
# Trackster config; taken from tracks/browser
latest_revision = visualization.latest_revision
+ bookmarks = latest_revision.config.get( 'bookmarks', [] )
tracks = []
# Set tracks.
@@ -193,7 +194,7 @@
} )
config = { "title": visualization.title, "vis_id": trans.security.encode_id( visualization.id ),
- "tracks": tracks, "chrom": "", "dbkey": visualization.dbkey }
+ "tracks": tracks, "bookmarks": bookmarks, "chrom": "", "dbkey": visualization.dbkey }
if 'viewport' in latest_revision.config:
config['viewport'] = latest_revision.config['viewport']
--- a/lib/galaxy/web/controllers/tracks.py Fri Jun 10 13:15:17 2011 -0400
+++ b/lib/galaxy/web/controllers/tracks.py Fri Jun 10 13:38:24 2011 -0400
@@ -571,6 +571,8 @@
vis_rev.dbkey = dbkey
# Tracks from payload
tracks = []
+ # TODO: why go through the trouble of unpacking config only to repack and
+ # put in database? How about sticking JSON directly into database?
for track in decoded_payload['tracks']:
tracks.append( { "dataset_id": track['dataset_id'],
"hda_ldda": track.get('hda_ldda', "hda"),
@@ -579,14 +581,15 @@
"prefs": track['prefs'],
"is_child": track.get('is_child', False)
} )
+ bookmarks = decoded_payload[ 'bookmarks' ]
+ vis_rev.config = { "tracks": tracks, "bookmarks": bookmarks }
# Viewport from payload
if 'viewport' in decoded_payload:
chrom = decoded_payload['viewport']['chrom']
start = decoded_payload['viewport']['start']
end = decoded_payload['viewport']['end']
- vis_rev.config = { "tracks": tracks, "viewport": { 'chrom': chrom, 'start': start, 'end': end } }
- else:
- vis_rev.config = { "tracks": tracks }
+ vis_rev.config[ "viewport" ] = { 'chrom': chrom, 'start': start, 'end': end }
+
vis.latest_revision = vis_rev
session.add( vis_rev )
session.flush()
--- a/static/june_2007_style/blue/trackster.css Fri Jun 10 13:15:17 2011 -0400
+++ b/static/june_2007_style/blue/trackster.css Fri Jun 10 13:38:24 2011 -0400
@@ -48,3 +48,7 @@
.param-label{float:left;font-weight:bold;padding-top:0.2em;}
.child-track-icon{background:url('../images/fugue/arrow-000-small-bw.png') no-repeat;width:30px;cursor:move;}
.track-resize{background:white url('../images/visualization/draggable_vertical.png') no-repeat top center;position:absolute;right:3px;bottom:-4px;width:14px;height:7px;border:solid #999 1px;z-index:100;}
+.bookmark{background:white;border:solid #999 1px;border-right:none;margin:0.5em;margin-right:0;padding:0.5em}
+.bookmark .position{font-weight:bold;}
+.icon-button.import{margin-left:0.5em;width:100%;}
+.delete-icon-container{float:right;}
\ No newline at end of file
--- a/static/june_2007_style/trackster.css.tmpl Fri Jun 10 13:15:17 2011 -0400
+++ b/static/june_2007_style/trackster.css.tmpl Fri Jun 10 13:38:24 2011 -0400
@@ -278,4 +278,22 @@
border: solid #999 1px;
z-index: 100;
}
+.bookmark {
+ background:white;
+ border:solid #999 1px;
+ border-right:none;
+ margin:0.5em;
+ margin-right:0;
+ padding:0.5em
+}
+.bookmark .position {
+ font-weight:bold;
+}
+.icon-button.import {
+ margin-left:0.5em;
+ width:100%;
+}
+.delete-icon-container {
+ float:right;
+}
--- a/static/scripts/galaxy.base.js Fri Jun 10 13:15:17 2011 -0400
+++ b/static/scripts/galaxy.base.js Fri Jun 10 13:38:24 2011 -0400
@@ -330,7 +330,76 @@
});
}
-// Edit and save text asynchronously.
+/**
+ * Returns editable text element. Element is a div with text: (a) when user clicks on text, a textbox/area
+ * enables user to edit text; (b) when user presses enter key, element's text is set.
+ */
+// TODO: use this function to implement async_save_text (implemented below).
+function get_editable_text_elt(text, use_textarea, num_cols, num_rows, on_finish) {
+ // Set defaults if necessary.
+ if (num_cols === undefined) {
+ num_cols = 30;
+ }
+ if (num_rows === undefined) {
+ num_rows = 4;
+ }
+
+ // Create div for element.
+ var container = $("<div/>").addClass("editable-text").text(text).click(function() {
+ // If there's already an input element, editing is active, so do nothing.
+ if ($(this).children(":input").length > 0) {
+ return;
+ }
+
+ container.removeClass("editable-text");
+
+ // Set element text.
+ var set_text = function(new_text) {
+ if (new_text != "") {
+ container.text(new_text);
+ }
+ else {
+ // Need a line so that there is a click target.
+ container.html("<br>");
+ }
+ container.addClass("editable-text");
+ };
+
+ // Create input element for editing.
+ var cur_text = container.text(),
+ input_elt = (use_textarea ?
+ $("<textarea></textarea>").attr({ rows: num_rows, cols: num_cols }).text( $.trim(cur_text) ) :
+ $("<input type='text'></input>").attr({ value: $.trim(cur_text), size: num_cols })
+ ).blur(function() {
+ $(this).remove();
+ set_text(cur_text);
+ }).keyup(function(e) {
+ if (e.keyCode === 27) {
+ // Escape key.
+ $(this).trigger("blur");
+ } else if (e.keyCode === 13) {
+ // Enter key.
+ $(this).remove();
+ var new_text = $(this).val();
+ set_text(new_text);
+ if (on_finish) {
+ on_finish(new_text);
+ }
+ }
+ });
+
+ // Replace text with input object and focus & select.
+ container.text("");
+ container.append(input_elt);
+ input_elt.focus();
+ input_elt.select();
+ });
+ return container;
+}
+
+/**
+ * Edit and save text asynchronously.
+ */
function async_save_text(click_to_edit_elt, text_elt_id, save_url, text_parm_name, num_cols, use_textarea, num_rows, on_start, on_finish) {
// Set defaults if necessary.
if (num_cols === undefined) {
@@ -341,7 +410,7 @@
}
// Set up input element.
- $("#" + click_to_edit_elt).live( "click", function() {
+ $("#" + click_to_edit_elt).live("click", function() {
// Check if this is already active
if ( $("#renaming-active").length > 0) {
return;
@@ -399,7 +468,7 @@
}
// Replace text with input object and focus & select.
text_elt.hide();
- t.insertAfter( text_elt );
+ t.insertAfter(text_elt);
t.focus();
t.select();
--- a/templates/tracks/browser.mako Fri Jun 10 13:15:17 2011 -0400
+++ b/templates/tracks/browser.mako Fri Jun 10 13:38:24 2011 -0400
@@ -3,7 +3,7 @@
<%def name="init()"><%
self.has_left_panel=False
- self.has_right_panel=False
+ self.has_right_panel=True
self.active_view="visualization"
self.message_box_visible=False
%>
@@ -35,19 +35,6 @@
</style></%def>
-<%def name="center_panel()">
-<div class="unified-panel-header" unselectable="on">
- <div class="unified-panel-header-inner">
- <div style="float:left;" id="title"></div>
- <a class="panel-header-button right-float" href="${h.url_for( controller='visualization', action='list' )}">Close</a>
- <a id="save-button" class="panel-header-button right-float" href="javascript:void(0);">Save</a>
- <a id="add-track" class="panel-header-button right-float" href="javascript:void(0);">Add Tracks</a>
- </div>
-</div>
-<div id="browser-container" class="unified-panel-body"></div>
-
-</%def>
-
<%def name="javascripts()">
${parent.javascripts()}
@@ -72,8 +59,36 @@
converted_datasets_state_url = "${h.url_for( action='converted_datasets_state' )}",
addable_track_types = { "LineTrack": LineTrack, "FeatureTrack": FeatureTrack, "ReadTrack": ReadTrack },
view;
+
+
+ /**
+ * Add bookmark.
+ */
+ var add_bookmark = function(position, annotation) {
+ var
+ bookmarks_container = $("#bookmarks-container"),
+ new_bookmark = $("<div/>").addClass("bookmark").appendTo(bookmarks_container),
+ delete_icon_container = $("<div/>").addClass("delete-icon-container").appendTo(new_bookmark).click(function (){
+ // Remove bookmark.
+ new_bookmark.slideUp("fast");
+ new_bookmark.remove();
+ view.has_changes = true;
+ return false;
+ }),
+ delete_icon = $("<a href=''/>").addClass("icon-button delete").appendTo(delete_icon_container),
+ position_div = $("<div/>").addClass("position").text(position).appendTo(new_bookmark),
+ annotation_div = get_editable_text_elt(annotation, false).addClass("annotation").appendTo(new_bookmark);
+
+ view.has_changes = true;
+ return new_bookmark;
+ }
$(function() {
+ // Hide bookmarks by default right now.
+ parent.force_right_panel("hide");
+
+ // Resize view when showing/hiding right panel (bookmarks for now).
+ $("#right-border").click(function() { view.resize_window(); });
%if config:
var callback;
@@ -107,6 +122,17 @@
parent_obj.add_track(track);
}
init();
+
+ // Load bookmarks.
+ var bookmarks = JSON.parse('${ h.to_json_string( config.get('bookmarks') ) }'),
+ bookmark;
+ for (var i = 0; i < bookmarks.length; i++) {
+ bookmark = bookmarks[i];
+ add_bookmark(bookmark['position'], bookmark['annotation']);
+ }
+
+ // View has no changes as of yet.
+ view.has_changes = false;
%else:
var continue_fn = function() {
view = new View( $("#browser-container"), $("#new-title").val(), undefined, $("#new-dbkey").val() );
@@ -149,13 +175,13 @@
track_data.prefs, track_data.filters, track_data.tool );
view.add_track(new_track);
// Should replace with live event but can't get working
- sortable( new_track.container_div, ".draghandle" );
+ sortable(new_track.container_div, ".draghandle");
view.has_changes = true;
$("#no-tracks").hide();
};
%if add_dataset is not None:
- $.ajax( {
+ $.ajax({
url: "${h.url_for( action='add_track_async' )}",
data: { hda_id: "${add_dataset}" },
dataType: "json",
@@ -164,105 +190,132 @@
%endif
- // Use a popup grid to add more tracks
- $("#add-track").bind("click", function(e) {
- $.ajax({
- url: "${h.url_for( action='list_histories' )}",
- data: { "f-dbkey": view.dbkey },
- error: function() { alert( "Grid failed" ); },
- success: function(table_html) {
- show_modal(
- "Select datasets for new tracks",
- table_html, {
- "Cancel": function() {
- hide_modal();
- },
- "Insert": function() {
- var requests = [];
- $('input[name=id]:checked,input[name=ldda_ids]:checked').each(function() {
- var data,
- id = $(this).val();
- if ($(this).attr("name") === "id") {
- data = { hda_id: id };
- } else {
- data = { ldda_id: id};
- }
- requests[requests.length] = $.ajax({
- url: "${h.url_for( action='add_track_async' )}",
- data: data,
- dataType: "json",
- });
- });
- // To preserve order, wait until there are definitions for all tracks and then add
- // them sequentially.
- $.when.apply($, requests).then(function() {
- // jQuery always returns an Array for arguments, so need to look at first element
- // to determine whether multiple requests were made and consequently how to
- // map arguments to track definitions.
- var track_defs = (arguments[0] instanceof Array ?
- $.map(arguments, function(arg) { return arg[0]; }) :
- [ arguments[0] ]
- );
- for (var i= 0; i < track_defs.length; i++) {
- add_async_success(track_defs[i]);
- }
- });
- hide_modal();
+ $("#viz-options-button").css( "position", "relative" );
+ make_popupmenu( $("#viz-options-button"), {
+ "Add Tracks": function() {
+ // Use a popup grid to add more tracks
+ $.ajax({
+ url: "${h.url_for( action='list_histories' )}",
+ data: { "f-dbkey": view.dbkey },
+ error: function() { alert( "Grid failed" ); },
+ success: function(table_html) {
+ show_modal(
+ "Select datasets for new tracks",
+ table_html, {
+ "Cancel": function() {
+ hide_modal();
+ },
+ "Insert": function() {
+ var requests = [];
+ $('input[name=id]:checked,input[name=ldda_ids]:checked').each(function() {
+ var data,
+ id = $(this).val();
+ if ($(this).attr("name") === "id") {
+ data = { hda_id: id };
+ } else {
+ data = { ldda_id: id};
+ }
+ requests[requests.length] = $.ajax({
+ url: "${h.url_for( action='add_track_async' )}",
+ data: data,
+ dataType: "json",
+ });
+ });
+ // To preserve order, wait until there are definitions for all tracks and then add
+ // them sequentially.
+ $.when.apply($, requests).then(function() {
+ // jQuery always returns an Array for arguments, so need to look at first element
+ // to determine whether multiple requests were made and consequently how to
+ // map arguments to track definitions.
+ var track_defs = (arguments[0] instanceof Array ?
+ $.map(arguments, function(arg) { return arg[0]; }) :
+ [ arguments[0] ]
+ );
+ for (var i= 0; i < track_defs.length; i++) {
+ add_async_success(track_defs[i]);
+ }
+ });
+ hide_modal();
+ }
}
- }
- );
- }
- });
+ );
+ }
+ });
+ },
+ "Save": function() {
+ // Show saving dialog box
+ show_modal("Saving...", "<img src='${h.url_for('/static/images/yui/rel_interstitial_loading.gif')}'/>");
+
+ // Save all tracks.
+ var tracks = [];
+ $(".viewport-container .track").each(function () {
+ // ID has form track_<main_track_id>_<child_track_id>
+ var
+ id_split = $(this).attr("id").split("_"),
+ track_id = id_split[1],
+ child_id = id_split[2];
+
+ // Get track.
+ var track = view.tracks[track_id];
+ if (child_id) {
+ track = track.child_tracks[child_id];
+ }
+
+ // Add track.
+ tracks.push( {
+ "track_type": track.track_type,
+ "name": track.name,
+ "hda_ldda": track.hda_ldda,
+ "dataset_id": track.dataset_id,
+ "prefs": track.prefs,
+ "is_child": (child_id ? true : false )
+ });
+ });
+
+ // Save all bookmarks.
+ var bookmarks = [];
+ $(".bookmark").each(function() {
+ bookmarks[bookmarks.length] = {
+ position: $(this).children(".position").text(),
+ annotation: $(this).children(".annotation").text()
+ };
+ });
+
+ var payload = {
+ 'tracks': tracks,
+ 'viewport': { 'chrom': view.chrom, 'start': view.low , 'end': view.high },
+ 'bookmarks': bookmarks
+ };
+
+ $.ajax({
+ url: "${h.url_for( action='save' )}",
+ type: "POST",
+ data: {
+ 'vis_id': view.vis_id,
+ 'vis_title': view.title,
+ 'dbkey': view.dbkey,
+ 'payload': JSON.stringify(payload)
+ },
+ success: function(vis_id) {
+ view.vis_id = vis_id;
+ view.has_changes = false;
+ hide_modal();
+ },
+ error: function() { alert("Could not save visualization"); }
+ });
+ },
+ "Bookmarks": function() {
+ // HACK -- use style to determine if panel is hidden and hide/show accordingly.
+ parent.force_right_panel(($("div#right").css("right") == "0px" ? "hide" : "show"));
+ },
+ "Close": function() { window.location = "${h.url_for( controller='visualization', action='list' )}"; }
});
- $("#save-button").bind("click", function(e) {
- // Show saving dialog box
- show_modal("Saving...", "<img src='${h.url_for('/static/images/yui/rel_interstitial_loading.gif')}'/>");
-
- // Save all tracks.
- var tracks = [];
- $(".viewport-container .track").each(function () {
- // ID has form track_<main_track_id>_<child_track_id>
- var
- id_split = $(this).attr("id").split("_"),
- track_id = id_split[1],
- child_id = id_split[2];
-
- // Get track.
- var track = view.tracks[track_id];
- if (child_id) {
- track = track.child_tracks[child_id];
- }
-
- // Add track.
- tracks.push( {
- "track_type": track.track_type,
- "name": track.name,
- "hda_ldda": track.hda_ldda,
- "dataset_id": track.dataset_id,
- "prefs": track.prefs,
- "is_child": (child_id ? true : false )
- });
- });
-
- var payload = { 'tracks': tracks, 'viewport': { 'chrom': view.chrom, 'start': view.low , 'end': view.high } };
-
- $.ajax({
- url: "${h.url_for( action='save' )}",
- type: "POST",
- data: {
- 'vis_id': view.vis_id,
- 'vis_title': view.title,
- 'dbkey': view.dbkey,
- 'payload': JSON.stringify(payload)
- },
- success: function(vis_id) {
- view.vis_id = vis_id;
- view.has_changes = false;
- hide_modal();
- },
- error: function() { alert("Could not save visualization"); }
- });
+ $("#add-bookmark-button").click(function() {
+ // Add new bookmark.
+ var position = view.chrom + ":" + view.low + "-" + view.high,
+ annotation = "Bookmark description";
+ return add_bookmark(position, annotation);
});
//
@@ -298,3 +351,32 @@
</script></%def>
+
+<%def name="center_panel()">
+<div class="unified-panel-header" unselectable="on">
+ <div class="unified-panel-header-inner">
+ <div style="float:left;" id="title"></div>
+ <div style="float: right">
+ <a id="viz-options-button" class='panel-header-button popup' href="javascript:void(0)" target="galaxy_main">${_('Options')}</a>
+ </div>
+ </div>
+</div>
+<div id="browser-container" class="unified-panel-body"></div>
+
+</%def>
+
+<%def name="right_panel()">
+
+<div class="unified-panel-header" unselectable="on">
+ <div class="unified-panel-header-inner">
+ Bookmarks
+ </div>
+</div>
+<div class="unified-panel-body" style="overflow: auto;">
+ <div id="bookmarks-container"></div>
+ <div>
+ <a class="icon-button import" original-title="Add Bookmark" id="add-bookmark-button" href="javascript:void(0);">Add Bookmark</a>
+ </div>
+</div>
+
+</%def>
--- a/templates/visualization/display.mako Fri Jun 10 13:15:17 2011 -0400
+++ b/templates/visualization/display.mako Fri Jun 10 13:38:24 2011 -0400
@@ -78,6 +78,7 @@
if (container_element.parents(".item-content").length > 0) { // Embedded viz
container_element.parents(".item-content").css( { "max-height": "none", "overflow": "visible" } );
} else { // Viewing just one shared viz
+ // TODO: need live or just bind click?
$("#right-border").live("click", function() { view.resize_window(); });
}
6 new changesets in galaxy-central:
http://bitbucket.org/galaxy/galaxy-central/changeset/01bd5792fbc7/
changeset: 01bd5792fbc7
user: fangly
date: 2010-12-16 09:26:59
summary: Added 2 Python scripts to deal with FASTQ mate pairs:
- the interlacer puts mate pairs present in 2 files into a single file
- the deinterlacer puts mate pairs present in a single file into 2 files
affected #: 3 files (3.0 KB)
--- a/lib/galaxy_utils/sequence/fastq.py Fri Jun 10 12:36:16 2011 -0400
+++ b/lib/galaxy_utils/sequence/fastq.py Thu Dec 16 18:26:59 2010 +1000
@@ -609,12 +609,15 @@
return rval
def get_paired_identifier( self, fastq_read ):
identifier = fastq_read.identifier
+ identifier_is_first = None
if identifier[-2] == '/':
if identifier[-1] == "1":
identifier = "%s2" % identifier[:-1]
+ identifier_is_first = False
elif identifier[-1] == "2":
identifier = "%s1" % identifier[:-1]
- return identifier
+ identifier_is_first = True
+ return identifier, identifier_is_first
class fastqSplitter( object ):
def split( self, fastq_read ):
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.py Thu Dec 16 18:26:59 2010 +1000
@@ -0,0 +1,53 @@
+#Florent Angly
+import sys
+from galaxy_utils.sequence.fastq import fastqReader, fastqWriter, fastqNamedReader, fastqJoiner
+
+def main():
+ input_filename = sys.argv[1]
+ input_type = sys.argv[2] or 'sanger'
+ mate1_filename = sys.argv[3]
+ mate2_filename = sys.argv[4]
+
+ type = input_type
+ input = fastqNamedReader( open( input_filename, 'rb' ), format = type )
+ out1 = fastqWriter( open( mate1_filename, 'wb' ), format = type )
+ out2 = fastqWriter( open( mate2_filename, 'wb' ), format = type )
+ joiner = fastqJoiner( type )
+
+ i = None
+ skip_count = 0
+ found = {}
+ for i, mate1 in enumerate( fastqReader( open( input_filename, 'rb' ), format = type ) ):
+
+ if mate1.identifier in found:
+ del found[mate1.identifier]
+ continue
+
+ mate2_id, mate2_is_first = joiner.get_paired_identifier( mate1 )
+
+ mate2 = input.get( mate2_id )
+ if mate2:
+ found[mate2_id] = None
+ if mate2_is_first:
+ out1.write( mate2 )
+ out2.write( mate1 )
+ else:
+ out1.write( mate1 )
+ out2.write( mate2 )
+ else:
+ skip_count += 1
+
+ if i is None:
+ print "Your input file contained no valid FASTQ sequences."
+ else:
+ if skip_count:
+ print '%i reads had no mate.' % skip_count
+ print 'De-interlaced %s pairs of sequences.' % ( (i - skip_count + 1)/2 )
+
+ input.close()
+ out1.close()
+ out2.close()
+
+
+if __name__ == "__main__":
+ main()
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/fastq/fastq_paired_end_interlacer.py Thu Dec 16 18:26:59 2010 +1000
@@ -0,0 +1,46 @@
+#Florent Angly
+import sys
+from galaxy_utils.sequence.fastq import fastqReader, fastqWriter, fastqNamedReader, fastqJoiner
+
+def main():
+ mate1_filename = sys.argv[1]
+ mate1_type = sys.argv[2] or 'sanger'
+ mate2_filename = sys.argv[3]
+ mate2_type = sys.argv[4] or 'sanger'
+ output_filename = sys.argv[5]
+
+ if mate1_type != mate2_type:
+ print "WARNING: You are trying to interlace files of two different types: %s and %s." % ( mate1_type, mate2_type )
+ return
+
+ type = mate1_type
+ joiner = fastqJoiner( type )
+ out = fastqWriter( open( output_filename, 'wb' ), format = type )
+ mate_input = fastqNamedReader( open( mate2_filename, 'rb' ), format = type )
+
+ i = None
+ skip_count = 0
+ for i, mate1 in enumerate( fastqReader( open( mate1_filename, 'rb' ), format = type ) ):
+
+ mate2 = mate_input.get( joiner.get_paired_identifier( mate1 ) )
+
+ if mate2:
+ out.write( mate1 )
+ out.write( mate2 )
+ else:
+ skip_count += 1
+
+ if i is None:
+ print "Your input file contained no valid FASTQ sequences."
+ else:
+ not_used_msg = mate_input.has_data()
+ if not_used_msg:
+ print not_used_msg
+ print 'Interlaced %s pairs of sequences.' % ( i - skip_count + 1 )
+
+ mate_input.close()
+ out.close()
+
+
+if __name__ == "__main__":
+ main()
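The get_paired_identifier change above relies on the common /1 ↔ /2 mate-naming convention for FASTQ read identifiers. A standalone sketch of that convention, mirroring the committed behavior:

```python
def get_paired_identifier(identifier):
    """Return (mate_id, mate_is_first) for a read id ending in /1 or /2.

    mate_is_first is True when the returned mate identifier is the first
    of the pair, False when it is the second, and None when the id has
    no /1 or /2 suffix (mirroring the fastq.py logic above).
    """
    if len(identifier) >= 2 and identifier[-2] == '/':
        if identifier[-1] == '1':
            return identifier[:-1] + '2', False
        if identifier[-1] == '2':
            return identifier[:-1] + '1', True
    return identifier, None
```

The deinterlacer uses the boolean to decide which output file each mate of a pair belongs in, so that mate 1 always lands in the first output regardless of which mate was encountered first.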
http://bitbucket.org/galaxy/galaxy-central/changeset/8fe0ba2e1910/
changeset: 8fe0ba2e1910
user: fangly
date: 2010-12-16 08:12:01
summary: Little bug fix and more informative error message
affected #: 1 file (26 bytes)
--- a/lib/galaxy_utils/sequence/fastq.py Thu Dec 16 18:26:59 2010 +1000
+++ b/lib/galaxy_utils/sequence/fastq.py Thu Dec 16 17:12:01 2010 +1000
@@ -438,7 +438,7 @@
while True:
line = self.file.readline()
if not line:
- raise Exception( 'Invalid FASTQ file: could not parse second instance of sequence identifier.' )
+ raise Exception( 'Invalid FASTQ file: could not find quality score of sequence identifier %s.' % rval.identifier )
line = line.rstrip( '\n\r' )
if line.startswith( '+' ) and ( len( line ) == 1 or line[1:].startswith( fastq_header[1:] ) ):
rval.description = line
@@ -547,7 +547,7 @@
eof = True
self.file.seek( offset )
if count:
- rval = "There were %i known sequence reads not utilized. "
+ rval = "There were %i known sequence reads not utilized. " % count
if not eof:
rval = "%s%s" % ( rval, "An additional unknown number of reads exist in the input that were not utilized." )
return rval
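The second hunk above fixes a stock Python mistake: a message was built with a %i placeholder but never interpolated via the % operator, so callers would have seen the literal "%i" in the output. A minimal illustration (variable names are mine, not from the patch):

```python
count = 3
# Before the fix: the format string is used as-is, placeholder intact.
buggy = "There were %i known sequence reads not utilized. "
# After the fix: the count is interpolated with the % operator.
fixed = "There were %i known sequence reads not utilized. " % count
```
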
http://bitbucket.org/galaxy/galaxy-central/changeset/eebd5ac107c3/
changeset: eebd5ac107c3
user: fangly
date: 2010-12-17 06:04:17
summary: FASTQ interlacer and de-interlacer tools fully integrated in Galaxy and functional
affected #: 15 files (17.2 KB)
--- a/lib/galaxy_utils/sequence/fastq.py Thu Dec 16 17:12:01 2010 +1000
+++ b/lib/galaxy_utils/sequence/fastq.py Fri Dec 17 15:04:17 2010 +1000
@@ -609,15 +609,22 @@
return rval
def get_paired_identifier( self, fastq_read ):
identifier = fastq_read.identifier
- identifier_is_first = None
if identifier[-2] == '/':
if identifier[-1] == "1":
identifier = "%s2" % identifier[:-1]
- identifier_is_first = False
elif identifier[-1] == "2":
identifier = "%s1" % identifier[:-1]
- identifier_is_first = True
- return identifier, identifier_is_first
+ return identifier
+ def is_first_mate( self, sequence_id ):
+ is_first = None
+ if not isinstance( sequence_id, basestring ):
+ sequence_id = sequence_id.identifier
+ if sequence_id[-2] == '/':
+ if sequence_id[-1] == "1":
+ is_first = True
+ else:
+ is_first = False
+ return is_first
class fastqSplitter( object ):
def split( self, fastq_read ):
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_1.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,20 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++4205:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_1_cleaned.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,16 @@
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++4205:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_1_errors.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,20 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@9999:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++9999:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_2.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,20 @@
+@1539:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++1539:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++3786:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_2_cleaned.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,16 @@
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++3786:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_2_errors.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,20 @@
+@1539:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++1539:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@9999:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++9999:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_merged.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,40 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@1539:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++1539:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++3786:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++4205:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_merged_cleaned.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,24 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@1539:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++1539:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_merged_errors.fastqsanger Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,40 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@9999:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++9999:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/2
+CTCGCACGGCCGCCTCGACCACTTGGTCTGGCGTCATGCGCAATTTTTTCTCCATGTGGAACGGGCTGGTGGCGATGAACGTATGAATATGCCCCCGCGCT
++2971:937/2
+hhhddhefhh_ffffhhhhfah_hhdUdcfW`fbbhfcaec_dfdbba````W^caaaJXGKXSUVYVZY^WY^BBBBBBBBBBBBBBBBBBBBBBBBBBB
+@2971:937/1
+NCGGAGACTTCGAGGCCATCCAGTCGATTGCCAAAGTCATCAAGGGGTCGACGATCTGCTCCCTTGCCCGTTCCAACGAGAATGAAATCCGCCGCGCGTGG
++2971:937/1
+BMQMMRRRSS__________XXXXXVVVVV_b___Y[Y[YXVRVWWPWVX_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@3786:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++3786:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++4205:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/1
+NAATGCCGGTATTTGGCACGATGGCGGCACGCTTCCACGACGACGGGGTGACCTCTCTCTATCAGGCGATGGCATCCAAATTGCACGCGCGGGGTTTGAGG
++4534:934/1
+BGGFGLJLJL______________V____________________YYYPQOTWVT__________BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4534:934/2
+GGTAATTGCGGACGGCTTCGGCAATTTCGGCCAGGTAGCGCACGCGCTTCGACGGAACGATGGCGCGCAGGTTCGACGATTGTCGAACGCTGATCAGCGCG
++4534:934/2
+ffffcff[fdhaghh[ffcahhghhhdhadhhhhg_hc[hf]fec]faa]bLb___`^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- a/tool_conf.xml.sample Thu Dec 16 17:12:01 2010 +1000
+++ b/tool_conf.xml.sample Fri Dec 17 15:04:17 2010 +1000
@@ -260,6 +260,8 @@
<tool file="fastq/fastq_trimmer.xml" />
<tool file="fastq/fastq_trimmer_by_quality.xml" />
<tool file="fastq/fastq_masker_by_quality.xml" />
+ <tool file="fastq/fastq_paired_end_interlacer.xml" />
+ <tool file="fastq/fastq_paired_end_deinterlacer.xml" />
  <tool file="fastq/fastq_manipulation.xml" />
  <tool file="fastq/fastq_to_fasta.xml" />
  <tool file="fastq/fastq_to_tabular.xml" />
--- a/tools/fastq/fastq_paired_end_deinterlacer.py Thu Dec 16 17:12:01 2010 +1000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.py Fri Dec 17 15:04:17 2010 +1000
@@ -23,17 +23,16 @@
del found[mate1.identifier]
continue
- mate2_id, mate2_is_first = joiner.get_paired_identifier( mate1 )
+ mate2 = input.get( joiner.get_paired_identifier( mate1 ) )
- mate2 = input.get( mate2_id )
if mate2:
- found[mate2_id] = None
- if mate2_is_first:
+ found[mate2.identifier] = None
+ if joiner.is_first_mate( mate1 ):
+ out1.write( mate1 )
+ out2.write( mate2 )
+ else:
out1.write( mate2 )
out2.write( mate1 )
- else:
- out1.write( mate1 )
- out2.write( mate2 )
else:
skip_count += 1
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.xml Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,64 @@
+<tool id="fastq_paired_end_deinterlacer" name="FASTQ de-interlacer" version="1.0.0">
+ <description>on paired end reads</description>
+ <command interpreter="python">fastq_paired_end_deinterlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$output1_file' '$output2_file'</command>
+ <inputs>
+ <param name="input1_file" type="data" format="fastqsanger,fastqcssanger" label="FASTQ reads" />
+ </inputs>
+ <outputs>
+ <data name="output1_file" format="input" />
+ <data name="output2_file" format="input" />
+ </outputs>
+ <tests>
+ <test>
+ <param name="input1_file" value="paired_end_merged.fastqsanger" ftype="fastqsanger" />
+ <output name="output1_file" file="paired_end_1.fastqsanger" />
+ <output name="output2_file" file="paired_end_2.fastqsanger" />
+ </test>
+ <test>
+ <param name="input1_file" value="paired_end_merged_errors.fastqsanger" ftype="fastqsanger" />
+ <output name="output1_file" file="paired_end_1_cleaned.fastqsanger" />
+ <output name="output2_file" file="paired_end_2_cleaned.fastqsanger" />
+ </test>
+ </tests>
+ <help>
+**What it does**
+
+De-interlaces a single fastq dataset representing paired-end run into two fastq datasets containing only the first or second mate read. Reads without mate are excluded from the output files.
+
+Sequence identifiers for paired-end reads must follow the /1 and /2 convention.
+
+-----
+
+**Input**
+
+A multiple-fastq file containing paired-end reads, for example::
+
+ @1539:931/1
+ ACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
+ +1539:931/1
+ BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+ @1539:931/2
+ CGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
+ +1539:931/2
+ WNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+-----
+
+**Output**
+
+Multi-fastq file with left-hand mate only::
+
+ @1539:931/1
+ ACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
+ +1539:931/1
+ BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+Multi-fastq file with right-hand mate only::
+
+ @1539:931/2
+ CGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
+ +1539:931/2
+ WNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+ </help>
+</tool>
--- a/tools/fastq/fastq_paired_end_interlacer.py Thu Dec 16 17:12:01 2010 +1000
+++ b/tools/fastq/fastq_paired_end_interlacer.py Fri Dec 17 15:04:17 2010 +1000
@@ -21,9 +21,7 @@
i = None
skip_count = 0
for i, mate1 in enumerate( fastqReader( open( mate1_filename, 'rb' ), format = type ) ):
-
mate2 = mate_input.get( joiner.get_paired_identifier( mate1 ) )
-
if mate2:
out.write( mate1 )
out.write( mate2 )
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/fastq/fastq_paired_end_interlacer.xml Fri Dec 17 15:04:17 2010 +1000
@@ -0,0 +1,64 @@
+<tool id="fastq_paired_end_interlacer" name="FASTQ interlacer" version="1.0.0">
+ <description>on paired end reads</description>
+ <command interpreter="python">fastq_paired_end_interlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$input2_file' '${input2_file.extension[len( 'fastq' ):]}' '$output_file'</command>
+ <inputs>
+ <param name="input1_file" type="data" format="fastqsanger,fastqcssanger" label="Left-hand mates" />
+ <param name="input2_file" type="data" format="fastqsanger,fastqcssanger" label="Right-hand mates" />
+ </inputs>
+ <outputs>
+ <data name="output_file" format="input" />
+ </outputs>
+ <tests>
+ <test>
+ <param name="input1_file" value="paired_end_1.fastqsanger" ftype="fastqsanger" />
+ <param name="input2_file" value="paired_end_2.fastqsanger" ftype="fastqsanger" />
+ <output name="output_file" file="paired_end_merged.fastqsanger" />
+ </test>
+ <test>
+ <param name="input1_file" value="paired_end_1_errors.fastqsanger" ftype="fastqsanger" />
+ <param name="input2_file" value="paired_end_2_errors.fastqsanger" ftype="fastqsanger" />
+ <output name="output_file" file="paired_end_merged_cleaned.fastqsanger" />
+ </test>
+ </tests>
+ <help>
+**What it does**
+
+This tool joins paired end FASTQ reads from two separate files, one with the left mates and one with the right mates, into a single files where letf mates alternate with their right mate. The join is performed using sequence identifiers, allowing the two files to contain differing ordering. If a sequence identifier does not appear in both files, it is excluded from the output.
+
+Sequence identifiers with /1 and /2 appended override the left-hand and right-hand designation; i.e. if the reads end with /1 and /2, the read containing /1 will be used as the left-hand read and the read containing /2 will be used as the right-hand read. Sequences without this designation will follow the left-hand and right-hand settings set by the user.
+
+-----
+
+**Input**
+
+Left-hand mates, for example::
+
+ @1539:931/1
+ ACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
+ +1539:931/1
+ BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+Right-hand mates, for example::
+
+ @1539:931/2
+ CGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
+ +1539:931/2
+ WNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+-----
+
+**Output**
+
+A multiple-fastq file containing interlaced left and right paired reads::
+
+ @1539:931/1
+ ACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
+ +1539:931/1
+ BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+ @1539:931/2
+ CGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
+ +1539:931/2
+ WNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+
+ </help>
+</tool>
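The join-by-identifier behavior this help text describes amounts to indexing one input by identifier and streaming the other. A minimal Python 3 sketch, with simplified `(identifier, record)` tuples standing in for Galaxy's fastq reader/writer classes and a hypothetical `paired_id` helper:

```python
def paired_id(ident):
    # hypothetical helper mirroring fastqJoiner.get_paired_identifier
    if len(ident) >= 2 and ident[-2] == '/' and ident[-1] in '12':
        return ident[:-1] + ('2' if ident[-1] == '1' else '1')
    return ident

def interlace(mates1, mates2):
    """mates1 / mates2: ordered (identifier, record) tuples.
    Returns records with each left mate followed by its right mate;
    reads whose identifier has no counterpart are skipped, as in
    this first version of the tool."""
    index2 = dict(mates2)  # identifier -> record lookup for file 2
    out = []
    for ident, rec in mates1:
        mate2 = index2.get(paired_id(ident))
        if mate2 is not None:
            out.append(rec)
            out.append(mate2)
    return out
```

Because the lookup is by identifier rather than by position, the two inputs may list their reads in different orders, exactly as the help text states.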
http://bitbucket.org/galaxy/galaxy-central/changeset/bc292ff9d647/
changeset: bc292ff9d647
user: fangly
date: 2011-05-17 09:26:04
summary: Interlacer and de-interlacer now keep track of single reads (that have no mate)
affected #: 7 files (4.0 KB)
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_1_cleaned_singles.fastqsanger Tue May 17 17:26:04 2011 +1000
@@ -0,0 +1,4 @@
+@1539:931/1
+NACATCAACACTCAGTAACGGCTGGCGCAAAATGGCATTGATTAACGAAGACTTCCCGCGCGTGAAGGCGCCGGCAAACGAGGCTCGGGAAGGGGCTCCCG
++1539:931/1
+BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_2_cleaned_singles.fastqsanger Tue May 17 17:26:04 2011 +1000
@@ -0,0 +1,4 @@
+@9999:931/2
+GCGCGTAACGTTTCACCTCGAGATCGTTGTCGGCCGCAATCTCCTGGGGGCGCCATTCCGAATCGTAGTTGTCGGCGTCTTCCAGTGCGGCAAGGCATCGT
++9999:931/2
+aee_dcadeeWcaaadJbdaff[fffc]dcfe[dRc^\[^QVOZXXZSPFWNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/paired_end_merged_cleaned_singles.fastqsanger Tue May 17 17:26:04 2011 +1000
@@ -0,0 +1,16 @@
+@3786:949/1
+NTACCGCGCAACGGCATGATGGCTTGGAACTCACGGTCACGCGCCTGTTTGGCAGAGCCGCCCGCCGAGTCACCTTCCACTAGGAACAGTTCGGAGCGGTT
++3786:949/1
+BKGGKKJNJJ_______W__Y__W_TVPVP[YY[[_____BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@9999:944/1
+NGATCTGGGCTTCAGCAAGACCGATGTCGGCGTGATTGCCAAGCATGCCGGACTCTGGCCGGCGGGGTTCGGCGGTGTGCTGGGTGGCTTGGGGGTGGGGG
++9999:944/1
+BLLLLTWTTR_V_______TYYYRYYYYYY____VWRWWW___BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@9999:949/2
+CTCAACCAGAACACCGTGATCGGCGACCAGTTGGCGCAGTTCGCCATCAGAAATGCAGGGATGCGGATGCGGGCTAGCACGAAAGTCATCCTCAACACGAT
++9999:949/2
+ffcfcaffff\_edefddff[ffa_fRffRdc]Sdf]affehh_eaebBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+@4205:944/2
+GTCGACAGGTGCCTGTACACCACGCCAGGCCAGCCAGGCGAAACCGAGAACGGTCACCATCTGAACCAGACCGAAAACCAACAGTGCGGGGTTGAGCCACG
++4205:944/2
+hhhhhcffcWcdfdcffdffdfQadf[fLfc`Ra`Wcca]`^``]L[^QZGSQWUYZXK[`bJRbZb[_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
--- a/tools/fastq/fastq_paired_end_deinterlacer.py Fri Dec 17 15:04:17 2010 +1000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.py Tue May 17 17:26:04 2011 +1000
@@ -3,16 +3,20 @@
from galaxy_utils.sequence.fastq import fastqReader, fastqWriter, fastqNamedReader, fastqJoiner
def main():
- input_filename = sys.argv[1]
- input_type = sys.argv[2] or 'sanger'
- mate1_filename = sys.argv[3]
- mate2_filename = sys.argv[4]
+ input_filename = sys.argv[1]
+ input_type = sys.argv[2] or 'sanger'
+ mate1_filename = sys.argv[3]
+ mate2_filename = sys.argv[4]
+ single1_filename = sys.argv[5]
+ single2_filename = sys.argv[6]
- type = input_type
- input = fastqNamedReader( open( input_filename, 'rb' ), format = type )
- out1 = fastqWriter( open( mate1_filename, 'wb' ), format = type )
- out2 = fastqWriter( open( mate2_filename, 'wb' ), format = type )
- joiner = fastqJoiner( type )
+ type = input_type
+ input = fastqNamedReader( open( input_filename, 'rb' ), format = type )
+ mate1_out = fastqWriter( open( mate1_filename, 'wb' ), format = type )
+ mate2_out = fastqWriter( open( mate2_filename, 'wb' ), format = type )
+ single1_out = fastqWriter( open( single1_filename, 'wb' ), format = type )
+ single2_out = fastqWriter( open( single2_filename, 'wb' ), format = type )
+ joiner = fastqJoiner( type )
i = None
skip_count = 0
@@ -26,27 +30,35 @@
mate2 = input.get( joiner.get_paired_identifier( mate1 ) )
if mate2:
+ # This is a mate pair
found[mate2.identifier] = None
if joiner.is_first_mate( mate1 ):
- out1.write( mate1 )
- out2.write( mate2 )
+ mate1_out.write( mate1 )
+ mate2_out.write( mate2 )
else:
- out1.write( mate2 )
- out2.write( mate1 )
+ mate1_out.write( mate2 )
+ mate2_out.write( mate1 )
else:
+ # This is a single
skip_count += 1
+ if joiner.is_first_mate( mate1 ):
+ single1_out.write( mate1 )
+ else:
+ single2_out.write( mate1 )
if i is None:
print "Your input file contained no valid FASTQ sequences."
else:
if skip_count:
- print '%i reads had no mate.' % skip_count
+ print 'There were %i reads with no mate.' % skip_count
print 'De-interlaced %s pairs of sequences.' % ( (i - skip_count + 1)/2 )
input.close()
- out1.close()
- out2.close()
-
+ mate1_out.close()
+ mate2_out.close()
+ single1_out.close()
+ single2_out.close()
+
if __name__ == "__main__":
main()
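The revised de-interlacer above routes each record three ways: left mates, right mates, and singles. A condensed Python 3 sketch of that routing (illustrative; the real tool additionally splits singles into separate left/right files via `is_first_mate`):

```python
def paired_id(ident):
    # hypothetical helper; mirrors the /1 <-> /2 swap
    if len(ident) >= 2 and ident[-2] == '/' and ident[-1] in '12':
        return ident[:-1] + ('2' if ident[-1] == '1' else '1')
    return ident

def deinterlace(records):
    """records: ordered (identifier, record) tuples from one
    interlaced file. Returns (left, right, singles) record lists."""
    index = dict(records)  # identifier -> record
    left, right, singles = [], [], []
    emitted = set()        # mates already written as part of a pair
    for ident, rec in records:
        if ident in emitted:
            continue
        mate = index.get(paired_id(ident))
        if mate is not None:
            emitted.add(paired_id(ident))
            if ident.endswith('/1'):
                left.append(rec)
                right.append(mate)
            else:
                left.append(mate)
                right.append(rec)
        else:
            singles.append(rec)
    return left, right, singles
```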
--- a/tools/fastq/fastq_paired_end_deinterlacer.xml Fri Dec 17 15:04:17 2010 +1000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.xml Tue May 17 17:26:04 2011 +1000
@@ -1,29 +1,35 @@
<tool id="fastq_paired_end_deinterlacer" name="FASTQ de-interlacer" version="1.0.0">
  <description>on paired end reads</description>
- <command interpreter="python">fastq_paired_end_deinterlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$output1_file' '$output2_file'</command>
+ <command interpreter="python">fastq_paired_end_deinterlacer.py '$input_file' '${input_file.extension[len( 'fastq' ):]}' '$output1_pairs_file' '$output2_pairs_file' '$output1_singles_file' '$output2_singles_file'</command>
  <inputs>
- <param name="input1_file" type="data" format="fastqsanger,fastqcssanger" label="FASTQ reads" />
+ <param name="input_file" type="data" format="fastqsanger,fastqcssanger" label="FASTQ reads" />
  </inputs>
  <outputs>
- <data name="output1_file" format="input" />
- <data name="output2_file" format="input" />
+ <data name="output1_pairs_file" format="input" label="FASTQ de-interlacer left mates from data ${input_file.hid}" />
+ <data name="output2_pairs_file" format="input" label="FASTQ de-interlacer right mates from data ${input_file.hid}"/>
+ <data name="output1_singles_file" format="input" label="FASTQ de-interlacer left singles from data ${input_file.hid}"/>
+ <data name="output2_singles_file" format="input" label="FASTQ de-interlacer right singles from data ${input_file.hid}"/>
  </outputs>
  <tests>
    <test>
- <param name="input1_file" value="paired_end_merged.fastqsanger" ftype="fastqsanger" />
- <output name="output1_file" file="paired_end_1.fastqsanger" />
- <output name="output2_file" file="paired_end_2.fastqsanger" />
+ <param name="input_file" value="paired_end_merged.fastqsanger" ftype="fastqsanger" />
+ <output name="output1_pairs_file" file="paired_end_1.fastqsanger" />
+ <output name="output2_pairs_file" file="paired_end_2.fastqsanger" />
+ <output name="output1_singles_file" file="paired_end_1_singles.fastqsanger" />
+ <output name="output2_singles_file" file="paired_end_2_singles.fastqsanger" />
    </test>
    <test>
- <param name="input1_file" value="paired_end_merged_errors.fastqsanger" ftype="fastqsanger" />
- <output name="output1_file" file="paired_end_1_cleaned.fastqsanger" />
- <output name="output2_file" file="paired_end_2_cleaned.fastqsanger" />
+ <param name="input_file" value="paired_end_merged_errors.fastqsanger" ftype="fastqsanger" />
+ <output name="output1_pairs_file" file="paired_end_1_cleaned.fastqsanger" />
+ <output name="output2_pairs_file" file="paired_end_2_cleaned.fastqsanger" />
+ <output name="output1_singles_file" file="paired_end_1_cleaned_singles.fastqsanger" />
+ <output name="output2_singles_file" file="paired_end_2_cleaned_singles.fastqsanger" />
    </test>
  </tests>
  <help>
**What it does**
-De-interlaces a single fastq dataset representing paired-end run into two fastq datasets containing only the first or second mate read. Reads without mate are excluded from the output files.
+De-interlaces a single fastq dataset representing paired-end run into two fastq datasets containing only the first or second mate read. Reads without mate are saved in separate output files.
Sequence identifiers for paired-end reads must follow the /1 and /2 convention.
--- a/tools/fastq/fastq_paired_end_interlacer.py Fri Dec 17 15:04:17 2010 +1000
+++ b/tools/fastq/fastq_paired_end_interlacer.py Tue May 17 17:26:04 2011 +1000
@@ -3,11 +3,12 @@
from galaxy_utils.sequence.fastq import fastqReader, fastqWriter, fastqNamedReader, fastqJoiner
def main():
- mate1_filename = sys.argv[1]
- mate1_type = sys.argv[2] or 'sanger'
- mate2_filename = sys.argv[3]
- mate2_type = sys.argv[4] or 'sanger'
- output_filename = sys.argv[5]
+ mate1_filename = sys.argv[1]
+ mate1_type = sys.argv[2] or 'sanger'
+ mate2_filename = sys.argv[3]
+ mate2_type = sys.argv[4] or 'sanger'
+ outfile_pairs = sys.argv[5]
+ outfile_singles = sys.argv[6]
if mate1_type != mate2_type:
print "WARNING: You are trying to interlace files of two different types: %s and %s." % ( mate1_type, mate2_type )
@@ -15,29 +16,43 @@
type = mate1_type
joiner = fastqJoiner( type )
- out = fastqWriter( open( output_filename, 'wb' ), format = type )
- mate_input = fastqNamedReader( open( mate2_filename, 'rb' ), format = type )
+ out_pairs = fastqWriter( open( outfile_pairs, 'wb' ), format = type )
+ out_singles = fastqWriter( open( outfile_singles, 'wb' ), format = type )
+ # Pairs + singles present in mate1
+ nof_singles = 0
+ nof_pairs = 0
+ mate2_input = fastqNamedReader( open( mate2_filename, 'rb' ), format = type )
i = None
- skip_count = 0
for i, mate1 in enumerate( fastqReader( open( mate1_filename, 'rb' ), format = type ) ):
- mate2 = mate_input.get( joiner.get_paired_identifier( mate1 ) )
+ mate2 = mate2_input.get( joiner.get_paired_identifier( mate1 ) )
if mate2:
- out.write( mate1 )
- out.write( mate2 )
+ out_pairs.write( mate1 )
+ out_pairs.write( mate2 )
+ nof_pairs += 1
else:
- skip_count += 1
+ out_singles.write( mate1 )
+ nof_singles += 1
- if i is None:
- print "Your input file contained no valid FASTQ sequences."
+ # Singles present in mate2
+ mate1_input = fastqNamedReader( open( mate1_filename, 'rb' ), format = type )
+ j = None
+ for j, mate2 in enumerate( fastqReader( open( mate2_filename, 'rb' ), format = type ) ):
+ mate1 = mate1_input.get( joiner.get_paired_identifier( mate2 ) )
+ if not mate1:
+ out_singles.write( mate2 )
+ nof_singles += 1
+
+ if (i is None) and (j is None):
+ print "Your input files contained no valid FASTQ sequences."
else:
- not_used_msg = mate_input.has_data()
- if not_used_msg:
- print not_used_msg
- print 'Interlaced %s pairs of sequences.' % ( i - skip_count + 1 )
+ print 'There were %s single reads.' % ( nof_singles )
+ print 'Interlaced %s pairs of sequences.' % ( nof_pairs )
- mate_input.close()
- out.close()
+ mate1_input.close()
+ mate2_input.close()
+ out_pairs.close()
+ out_singles.close()
if __name__ == "__main__":
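The diff above turns the interlacer into a two-pass process: pass 1 pairs mate1 reads against an index of mate2 (counting `nof_pairs`/`nof_singles`), and pass 2 catches mate2 reads whose partner never appeared in mate1. A self-contained Python 3 sketch of that accounting, using simplified `(identifier, record)` tuples rather than Galaxy's `fastqNamedReader`:

```python
def paired_id(ident):
    # hypothetical helper mirroring fastqJoiner.get_paired_identifier
    if len(ident) >= 2 and ident[-2] == '/' and ident[-1] in '12':
        return ident[:-1] + ('2' if ident[-1] == '1' else '1')
    return ident

def split_pairs_and_singles(mates1, mates2):
    """Returns (pairs, singles): pairs interlaced left/right, singles
    gathered from both inputs."""
    index1, index2 = dict(mates1), dict(mates2)
    pairs, singles = [], []
    for ident, rec in mates1:          # pairs + singles present in mate1
        mate2 = index2.get(paired_id(ident))
        if mate2 is not None:
            pairs.append(rec)
            pairs.append(mate2)
        else:
            singles.append(rec)
    for ident, rec in mates2:          # singles present only in mate2
        if paired_id(ident) not in index1:
            singles.append(rec)
    return pairs, singles
```

The second pass is what the original version lacked: previously, unmatched right-hand reads were simply reported as "not utilized" instead of being written anywhere.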
--- a/tools/fastq/fastq_paired_end_interlacer.xml Fri Dec 17 15:04:17 2010 +1000
+++ b/tools/fastq/fastq_paired_end_interlacer.xml Tue May 17 17:26:04 2011 +1000
@@ -1,29 +1,35 @@
<tool id="fastq_paired_end_interlacer" name="FASTQ interlacer" version="1.0.0">
  <description>on paired end reads</description>
- <command interpreter="python">fastq_paired_end_interlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$input2_file' '${input2_file.extension[len( 'fastq' ):]}' '$output_file'</command>
+ <command interpreter="python">fastq_paired_end_interlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$input2_file' '${input2_file.extension[len( 'fastq' ):]}' '$outfile_pairs' '$outfile_singles'</command>
  <inputs>
    <param name="input1_file" type="data" format="fastqsanger,fastqcssanger" label="Left-hand mates" />
    <param name="input2_file" type="data" format="fastqsanger,fastqcssanger" label="Right-hand mates" />
  </inputs>
  <outputs>
- <data name="output_file" format="input" />
+ <!-- $input1_file.name = filename , e.g. paired_end_2_errors.fastqsanger -->
+ <!-- $input1_file.id = ID , e.g. 10 -->
+ <!-- $input1_file.hid = history ID, e.g. 5 -->
+ <data name="outfile_pairs" format="input" label="FASTQ interlacer pairs from data ${input1_file.hid} and data ${input2_file.hid}"/>
+ <data name="outfile_singles" format="input" label="FASTQ interlacer singles from data data ${input1_file.hid} and data ${input2_file.hid}"/>
  </outputs>
  <tests>
    <test>
      <param name="input1_file" value="paired_end_1.fastqsanger" ftype="fastqsanger" />
      <param name="input2_file" value="paired_end_2.fastqsanger" ftype="fastqsanger" />
- <output name="output_file" file="paired_end_merged.fastqsanger" />
+ <output name="outfile_pairs" file="paired_end_merged.fastqsanger" />
+ <output name="outfile_singles" file="paired_end_merged_singles.fastqsanger" /></test><test><param name="input1_file" value="paired_end_1_errors.fastqsanger" ftype="fastqsanger" /><param name="input2_file" value="paired_end_2_errors.fastqsanger" ftype="fastqsanger" />
- <output name="output_file" file="paired_end_merged_cleaned.fastqsanger" />
+ <output name="outfile_pairs" file="paired_end_merged_cleaned.fastqsanger" />
+ <output name="outfile_singles" file="paired_end_merged_cleaned_singles.fastqsanger" /></test></tests><help>
**What it does**
-This tool joins paired end FASTQ reads from two separate files, one with the left mates and one with the right mates, into a single files where letf mates alternate with their right mate. The join is performed using sequence identifiers, allowing the two files to contain differing ordering. If a sequence identifier does not appear in both files, it is excluded from the output.
+This tool joins paired end FASTQ reads from two separate files, one with the left mates and one with the right mates, into a single file where left mates alternate with their right mates. The join is performed using sequence identifiers, allowing the two files to contain differing ordering. If a sequence identifier does not appear in both files, it is included in a separate file.
Sequence identifiers with /1 and /2 appended override the left-hand and right-hand designation; i.e. if the reads end with /1 and /2, the read containing /1 will be used as the left-hand read and the read containing /2 will be used as the right-hand read. Sequences without this designation will follow the left-hand and right-hand settings set by the user.
@@ -60,5 +66,7 @@
+1539:931/2
WNUUZ\P^`BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
+A multiple-fastq file containing reads that have no mate is also produced.
+
</help></tool>
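The help text above describes matching mates by sequence identifier, with `/1` and `/2` suffixes marking left- and right-hand reads and unmatched reads going to a separate singles output. A minimal sketch of that matching logic is below; it is not the actual `fastq_paired_end_interlacer.py` code, and the helper names and the simple 4-line record parser are illustrative assumptions:

```python
# Sketch of interlacing two FASTQ files by read identifier.
# NOT the actual fastq_paired_end_interlacer.py implementation;
# function names and the 4-line record parser are assumptions.

def read_fastq(lines):
    """Yield (identifier, record) tuples from 4-line FASTQ records."""
    it = iter(lines)
    for header in it:
        seq, plus, qual = next(it), next(it), next(it)
        # Strip the leading '@' and any trailing /1 or /2 mate suffix
        # so left and right mates share the same key.
        name = header[1:].strip().rsplit('/', 1)[0]
        yield name, (header.strip(), seq.strip(), plus.strip(), qual.strip())

def interlace(mate1_lines, mate2_lines):
    """Return (pairs, singles); pairs alternate left and right mates."""
    left = dict(read_fastq(mate1_lines))
    right = dict(read_fastq(mate2_lines))
    pairs, singles = [], []
    for name, rec in left.items():
        if name in right:
            pairs.extend([rec, right[name]])   # left mate, then its right mate
        else:
            singles.append(rec)                # left read with no right mate
    # Right reads whose identifier never appeared on the left.
    singles.extend(rec for name, rec in right.items() if name not in left)
    return pairs, singles
```

The real tool additionally streams records instead of loading whole files and reports the pair and single counts it wrote, as the `.py` diff earlier in this changeset shows.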
http://bitbucket.org/galaxy/galaxy-central/changeset/de86763942a3/
changeset: de86763942a3
user: fangly
date: 2011-05-17 09:54:04
summary: Updated tool wrapper version number
affected #: 2 files (4 bytes)
--- a/tools/fastq/fastq_paired_end_deinterlacer.xml Tue May 17 17:26:04 2011 +1000
+++ b/tools/fastq/fastq_paired_end_deinterlacer.xml Tue May 17 17:54:04 2011 +1000
@@ -1,4 +1,4 @@
-<tool id="fastq_paired_end_deinterlacer" name="FASTQ de-interlacer" version="1.0.0">
+<tool id="fastq_paired_end_deinterlacer" name="FASTQ de-interlacer" version="1.1"><description>on paired end reads</description><command interpreter="python">fastq_paired_end_deinterlacer.py '$input_file' '${input_file.extension[len( 'fastq' ):]}' '$output1_pairs_file' '$output2_pairs_file' '$output1_singles_file' '$output2_singles_file'</command><inputs>
--- a/tools/fastq/fastq_paired_end_interlacer.xml Tue May 17 17:26:04 2011 +1000
+++ b/tools/fastq/fastq_paired_end_interlacer.xml Tue May 17 17:54:04 2011 +1000
@@ -1,4 +1,4 @@
-<tool id="fastq_paired_end_interlacer" name="FASTQ interlacer" version="1.0.0">
+<tool id="fastq_paired_end_interlacer" name="FASTQ interlacer" version="1.1"><description>on paired end reads</description><command interpreter="python">fastq_paired_end_interlacer.py '$input1_file' '${input1_file.extension[len( 'fastq' ):]}' '$input2_file' '${input2_file.extension[len( 'fastq' ):]}' '$outfile_pairs' '$outfile_singles'</command><inputs>
http://bitbucket.org/galaxy/galaxy-central/changeset/36ced92e1da8/
changeset: 36ced92e1da8
user: kanwei
date: 2011-06-10 19:15:17
summary: Typo
affected #: 1 file (5 bytes)
--- a/tools/fastq/fastq_paired_end_interlacer.xml Tue May 17 17:54:04 2011 +1000
+++ b/tools/fastq/fastq_paired_end_interlacer.xml Fri Jun 10 13:15:17 2011 -0400
@@ -10,7 +10,7 @@
<!-- $input1_file.id = ID , e.g. 10 --><!-- $input1_file.hid = history ID, e.g. 5 --><data name="outfile_pairs" format="input" label="FASTQ interlacer pairs from data ${input1_file.hid} and data ${input2_file.hid}"/>
- <data name="outfile_singles" format="input" label="FASTQ interlacer singles from data data ${input1_file.hid} and data ${input2_file.hid}"/>
+ <data name="outfile_singles" format="input" label="FASTQ interlacer singles from data ${input1_file.hid} and data ${input2_file.hid}"/></outputs><tests><test>
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.