galaxy-dev

29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/6b924dd68e77
changeset: 2653:6b924dd68e77
user: James Taylor <james(a)jamestaylor.org>
date: Fri Aug 28 18:09:43 2009 -0400
description:
Removing the galaxy.web.framework.servers module (unused for a long time)
6 file(s) affected in this change:
lib/galaxy/web/framework/servers/__init__.py
lib/galaxy/web/framework/servers/flup/__init__.py
lib/galaxy/web/framework/servers/flup/ajp_forkthreaded.py
lib/galaxy/web/framework/servers/flup/preforkthreadedserver.py
lib/galaxy/web/framework/servers/fork_server.py
lib/galaxy/web/framework/servers/threadpool_server.py
diffs (1154 lines):
diff -r fba947d16fa7 -r 6b924dd68e77 lib/galaxy/web/framework/servers/__init__.py
--- a/lib/galaxy/web/framework/servers/__init__.py Fri Aug 28 18:03:28 2009 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,3 +0,0 @@
-"""
-Various WSGI webserver implementations.
-"""
\ No newline at end of file
diff -r fba947d16fa7 -r 6b924dd68e77 lib/galaxy/web/framework/servers/flup/ajp_forkthreaded.py
--- a/lib/galaxy/web/framework/servers/flup/ajp_forkthreaded.py Fri Aug 28 18:03:28 2009 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,211 +0,0 @@
-# Copyright (c) 2005, 2006 Allan Saddi <allan(a)saddi.com>
-# All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# 1. Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# 2. Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-#
-# THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
-# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
-# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
-# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
-# OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
-# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
-# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
-# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
-# SUCH DAMAGE.
-#
-# $Id: ajp_fork.py 2188 2006-12-05 22:11:45Z asaddi $
-
-"""
-ajp - an AJP 1.3/WSGI gateway.
-
-For more information about AJP and AJP connectors for your web server, see
-<http://jakarta.apache.org/tomcat/connectors-doc/>.
-
-For more information about the Web Server Gateway Interface, see
-<http://www.python.org/peps/pep-0333.html>.
-
-Example usage:
-
- #!/usr/bin/env python
- import sys
- from myapplication import app # Assume app is your WSGI application object
- from ajp import WSGIServer
- ret = WSGIServer(app).run()
- sys.exit(ret and 42 or 0)
-
-See the documentation for WSGIServer for more information.
-
-About the bit of logic at the end:
-Upon receiving SIGHUP, the python script will exit with status code 42. This
-can be used by a wrapper script to determine if the python script should be
-re-run. When a SIGINT or SIGTERM is received, the script exits with status
-code 0, possibly indicating a normal exit.
-
-Example wrapper script:
-
- #!/bin/sh
- STATUS=42
- while test $STATUS -eq 42; do
- python "$@" that_script_above.py
- STATUS=$?
- done
-
-Example workers.properties (for mod_jk):
-
- worker.list=foo
- worker.foo.port=8009
- worker.foo.host=localhost
- worker.foo.type=ajp13
-
-Example httpd.conf (for mod_jk):
-
- JkWorkersFile /path/to/workers.properties
- JkMount /* foo
-
-Note that if you mount your ajp application anywhere but the root ("/"), you
-SHOULD specifiy scriptName to the WSGIServer constructor. This will ensure
-that SCRIPT_NAME/PATH_INFO are correctly deduced.
-"""
-
-__author__ = 'Allan Saddi <allan(a)saddi.com>'
-__version__ = '$Revision: 2188 $'
-
-import socket
-import logging
-
-from flup.server.ajp_base import BaseAJPServer, Connection
-from preforkthreadedserver import PreforkThreadedServer
-
-__all__ = ['WSGIServer']
-
-class WSGIServer(BaseAJPServer, PreforkThreadedServer):
- """
- AJP1.3/WSGI server. Runs your WSGI application as a persistant program
- that understands AJP1.3. Opens up a TCP socket, binds it, and then
- waits for forwarded requests from your webserver.
-
- Why AJP? Two good reasons are that AJP provides load-balancing and
- fail-over support. Personally, I just wanted something new to
- implement. :)
-
- Of course you will need an AJP1.3 connector for your webserver (e.g.
- mod_jk) - see <http://jakarta.apache.org/tomcat/connectors-doc/>.
- """
- def __init__(self, application, scriptName='', environ=None,
- bindAddress=('localhost', 8009), allowedServers=None,
- loggingLevel=logging.INFO, debug=True, **kw):
- """
- scriptName is the initial portion of the URL path that "belongs"
- to your application. It is used to determine PATH_INFO (which doesn't
- seem to be passed in). An empty scriptName means your application
- is mounted at the root of your virtual host.
-
- environ, which must be a dictionary, can contain any additional
- environment variables you want to pass to your application.
-
- bindAddress is the address to bind to, which must be a tuple of
- length 2. The first element is a string, which is the host name
- or IPv4 address of a local interface. The 2nd element is the port
- number.
-
- allowedServers must be None or a list of strings representing the
- IPv4 addresses of servers allowed to connect. None means accept
- connections from anywhere.
-
- loggingLevel sets the logging level of the module-level logger.
- """
- BaseAJPServer.__init__(self, application,
- scriptName=scriptName,
- environ=environ,
- multithreaded=False,
- multiprocess=True,
- bindAddress=bindAddress,
- allowedServers=allowedServers,
- loggingLevel=loggingLevel,
- debug=debug)
- for key in ('multithreaded', 'multiprocess', 'jobClass', 'jobArgs'):
- if kw.has_key(key):
- del kw[key]
- PreforkThreadedServer.__init__(self, jobClass=Connection, jobArgs=(self,), **kw)
-
- def run(self):
- """
- Main loop. Call this after instantiating WSGIServer. SIGHUP, SIGINT,
- SIGQUIT, SIGTERM cause it to cleanup and return. (If a SIGHUP
- is caught, this method returns True. Returns False otherwise.)
- """
- self.logger.info('%s starting up', self.__class__.__name__)
-
- try:
- sock = self._setupSocket()
- except socket.error, e:
- self.logger.error('Failed to bind socket (%s), exiting', e[1])
- return False
-
- ret = PreforkThreadedServer.run(self, sock)
-
- self._cleanupSocket(sock)
-
- self.logger.info('%s shutting down%s', self.__class__.__name__,
- self._hupReceived and ' (reload requested)' or '')
-
- return ret
-
-def paste_factory_helper(wsgiServerClass, global_conf, host, port, **local_conf):
- # I think I can't write a tuple for bindAddress in .ini file
- host = host or global_conf.get('host', 'localhost')
- port = port or global_conf.get('port', 4000)
-
- local_conf['bindAddress'] = (host, int(port))
-
- def server(application):
- server = wsgiServerClass(application, **local_conf)
- server.run()
-
- return server
-
-def factory(global_conf, host=None, port=None, **local):
- return paste_factory_helper(WSGIServer, global_conf, host, port, **local)
-
-if __name__ == '__main__':
- def test_app(environ, start_response):
- """Probably not the most efficient example."""
- import cgi
- start_response('200 OK', [('Content-Type', 'text/html')])
- yield '<html><head><title>Hello World!</title></head>\n' \
- '<body>\n' \
- '<p>Hello World!</p>\n' \
- '<table border="1">'
- names = environ.keys()
- names.sort()
- for name in names:
- yield '<tr><td>%s</td><td>%s</td></tr>\n' % (
- name, cgi.escape(`environ[name]`))
-
- form = cgi.FieldStorage(fp=environ['wsgi.input'], environ=environ,
- keep_blank_values=1)
- if form.list:
- yield '<tr><th colspan="2">Form data</th></tr>'
-
- for field in form.list:
- yield '<tr><td>%s</td><td>%s</td></tr>\n' % (
- field.name, field.value)
-
- yield '</table>\n' \
- '</body></html>\n'
-
- from wsgiref import validate
- test_app = validate.validator(test_app)
- # Explicitly set bindAddress to *:4001 for testing.
- WSGIServer(test_app,
- bindAddress=('', 4001), allowedServers=None,
- loggingLevel=logging.DEBUG).run()
diff -r fba947d16fa7 -r 6b924dd68e77 lib/galaxy/web/framework/servers/flup/preforkthreadedserver.py
--- a/lib/galaxy/web/framework/servers/flup/preforkthreadedserver.py Fri Aug 28 18:03:28 2009 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,469 +0,0 @@
-# Copyright (c) 2005 Allan Saddi <allan(a)saddi.com>
-# All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions
-# are met:
-# 1. Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# 2. Redistributions in binary form must reproduce the above copyright
-# notice, this list of conditions and the following disclaimer in the
-# documentation and/or other materials provided with the distribution.
-#
-# THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
-# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
-# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
-# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
-# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
-# OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
-# HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
-# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
-# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
-# SUCH DAMAGE.
-#
-# $Id: preforkserver.py 2311 2007-01-23 00:05:04Z asaddi $
-
-__author__ = 'Allan Saddi <allan(a)saddi.com>'
-__version__ = '$Revision: 2311 $'
-
-import sys
-import os
-import socket
-import select
-import errno
-import signal
-import threading
-
-try:
- import fcntl
-except ImportError:
- def setCloseOnExec(sock):
- pass
-else:
- def setCloseOnExec(sock):
- fcntl.fcntl(sock.fileno(), fcntl.F_SETFD, fcntl.FD_CLOEXEC)
-
-from flup.server.threadpool import ThreadPool
-
-# If running Python < 2.4, require eunuchs module for socket.socketpair().
-# See <http://www.inoi.fi/open/trac/eunuchs>.
-if not hasattr(socket, 'socketpair'):
- try:
- import eunuchs.socketpair
- except ImportError:
- # TODO: Other alternatives? Perhaps using os.pipe()?
- raise ImportError, 'Requires eunuchs module for Python < 2.4'
-
- def socketpair():
- s1, s2 = eunuchs.socketpair.socketpair()
- p, c = (socket.fromfd(s1, socket.AF_UNIX, socket.SOCK_STREAM),
- socket.fromfd(s2, socket.AF_UNIX, socket.SOCK_STREAM))
- os.close(s1)
- os.close(s2)
- return p, c
-
- socket.socketpair = socketpair
-
-class PreforkThreadedServer(object):
- """
- A preforked server model conceptually similar to Apache httpd(2). At
- any given time, ensures there are at least minSpare children ready to
- process new requests (up to a maximum of maxChildren children total).
- If the number of idle children is ever above maxSpare, the extra
- children are killed.
-
- If maxRequests is positive, each child will only handle that many
- requests in its lifetime before exiting.
-
- jobClass should be a class whose constructor takes at least two
- arguments: the client socket and client address. jobArgs, which
- must be a list or tuple, is any additional (static) arguments you
- wish to pass to the constructor.
-
- jobClass should have a run() method (taking no arguments) that does
- the actual work. When run() returns, the request is considered
- complete and the child process moves to idle state.
- """
- def __init__(self, minSpare=32, maxSpare=32, maxChildren=32,
- maxRequests=0, maxThreads=10, jobClass=None, jobArgs=()):
- self._minSpare = int( minSpare )
- self._maxSpare = int( maxSpare )
- self._maxChildren = max(maxSpare, int( maxChildren ) )
- self._maxRequests = int( maxRequests )
- self._maxThreads = int( maxThreads )
- self._jobClass = jobClass
- self._jobArgs = jobArgs
-
- # Internal state of children. Maps pids to dictionaries with two
- # members: 'file' and 'avail'. 'file' is the socket to that
- # individidual child and 'avail' is whether or not the child is
- # free to process requests.
- self._children = {}
-
- def run(self, sock):
- """
- The main loop. Pass a socket that is ready to accept() client
- connections. Return value will be True or False indiciating whether
- or not the loop was exited due to SIGHUP.
- """
- # Set up signal handlers.
- self._keepGoing = True
- self._hupReceived = False
- self._installSignalHandlers()
-
- # Don't want operations on main socket to block.
- sock.setblocking(0)
-
- # Set close-on-exec
- setCloseOnExec(sock)
-
- # Main loop.
- while self._keepGoing:
- # Maintain minimum number of children.
- while len(self._children) < self._maxSpare:
- if not self._spawnChild(sock): break
-
- # Wait on any socket activity from live children.
- r = [x['file'] for x in self._children.values()
- if x['file'] is not None]
-
- if len(r) == len(self._children):
- timeout = None
- else:
- # There are dead children that need to be reaped, ensure
- # that they are by timing out, if necessary.
- timeout = 2
-
- try:
- r, w, e = select.select(r, [], [], timeout)
- except select.error, e:
- if e[0] != errno.EINTR:
- raise
-
- # Scan child sockets and tend to those that need attention.
- for child in r:
- # Receive status byte.
- try:
- state = child.recv(1)
- except socket.error, e:
- if e[0] in (errno.EAGAIN, errno.EINTR):
- # Guess it really didn't need attention?
- continue
- raise
- # Try to match it with a child. (Do we need a reverse map?)
- for pid,d in self._children.items():
- if child is d['file']:
- if state:
- # Set availability status accordingly.
- self._children[pid]['avail'] = state != '\x00'
- else:
- # Didn't receive anything. Child is most likely
- # dead.
- d = self._children[pid]
- d['file'].close()
- d['file'] = None
- d['avail'] = False
-
- # Reap children.
- self._reapChildren()
-
- # See who and how many children are available.
- availList = filter(lambda x: x[1]['avail'], self._children.items())
- avail = len(availList)
-
- if avail < self._minSpare:
- # Need to spawn more children.
- while avail < self._minSpare and \
- len(self._children) < self._maxChildren:
- if not self._spawnChild(sock): break
- avail += 1
- elif avail > self._maxSpare:
- # Too many spares, kill off the extras.
- pids = [x[0] for x in availList]
- pids.sort()
- pids = pids[self._maxSpare:]
- for pid in pids:
- d = self._children[pid]
- d['file'].close()
- d['file'] = None
- d['avail'] = False
-
- # Clean up all child processes.
- self._cleanupChildren()
-
- # Restore signal handlers.
- self._restoreSignalHandlers()
-
- # Return bool based on whether or not SIGHUP was received.
- return self._hupReceived
-
- def _cleanupChildren(self):
- """
- Closes all child sockets (letting those that are available know
- that it's time to exit). Sends SIGINT to those that are currently
- processing (and hopes that it finishses ASAP).
-
- Any children remaining after 10 seconds is SIGKILLed.
- """
- # Let all children know it's time to go.
- for pid,d in self._children.items():
- if d['file'] is not None:
- d['file'].close()
- d['file'] = None
- if not d['avail']:
- # Child is unavailable. SIGINT it.
- try:
- os.kill(pid, signal.SIGINT)
- except OSError, e:
- if e[0] != errno.ESRCH:
- raise
-
- def alrmHandler(signum, frame):
- pass
-
- # Set up alarm to wake us up after 10 seconds.
- oldSIGALRM = signal.getsignal(signal.SIGALRM)
- signal.signal(signal.SIGALRM, alrmHandler)
- signal.alarm(10)
-
- # Wait for all children to die.
- while len(self._children):
- try:
- pid, status = os.wait()
- except OSError, e:
- if e[0] in (errno.ECHILD, errno.EINTR):
- break
- if self._children.has_key(pid):
- del self._children[pid]
-
- signal.signal(signal.SIGALRM, oldSIGALRM)
-
- # Forcefully kill any remaining children.
- for pid in self._children.keys():
- try:
- os.kill(pid, signal.SIGKILL)
- except OSError, e:
- if e[0] != errno.ESRCH:
- raise
-
- def _reapChildren(self):
- """Cleans up self._children whenever children die."""
- while True:
- try:
- pid, status = os.waitpid(-1, os.WNOHANG)
- except OSError, e:
- if e[0] == errno.ECHILD:
- break
- raise
- if pid <= 0:
- break
- if self._children.has_key(pid): # Sanity check.
- if self._children[pid]['file'] is not None:
- self._children[pid]['file'].close()
- del self._children[pid]
-
- def _spawnChild(self, sock):
- """
- Spawn a single child. Returns True if successful, False otherwise.
- """
- # This socket pair is used for very simple communication between
- # the parent and its children.
- parent, child = socket.socketpair()
- parent.setblocking(0)
- setCloseOnExec(parent)
- child.setblocking(0)
- setCloseOnExec(child)
- try:
- pid = os.fork()
- except OSError, e:
- if e[0] in (errno.EAGAIN, errno.ENOMEM):
- return False # Can't fork anymore.
- raise
- if not pid:
- # Child
- child.close()
- # Put child into its own process group.
- pid = os.getpid()
- os.setpgid(pid, pid)
- # Restore signal handlers.
- self._restoreSignalHandlers()
- # Close copies of child sockets.
- for f in [x['file'] for x in self._children.values()
- if x['file'] is not None]:
- f.close()
- self._children = {}
- try:
- # Enter main loop.
- self._child(sock, parent)
- except KeyboardInterrupt:
- pass
- sys.exit(0)
- else:
- # Parent
- parent.close()
- d = self._children[pid] = {}
- d['file'] = child
- d['avail'] = True
- return True
-
- def _isClientAllowed(self, addr):
- """Override to provide access control."""
- return True
-
- def _child(self, sock, parent):
- """Main loop for children."""
- requestCount = 0
-
- # For the moment we fix the number of threads per process exactly
- threadPool = ThreadPool( minSpare=self._maxThreads,
- maxSpare=self._maxThreads,
- maxThreads=self._maxThreads )
-
- activeThreads = [0]
- activeThreadsLock = threading.Lock()
-
- underCapacity = threading.Event()
- underCapacity.set()
-
- def jobFinished():
- activeThreadsLock.acquire()
- try:
- if activeThreads[0] == self._maxThreads:
- underCapacity.set()
- # Tell parent we're free again.
- try:
- parent.send('\xff')
- except socket.error, e:
- if e[0] == errno.EPIPE:
- # Parent is gone.
- return
- raise
- activeThreads[0] -= 1
- finally:
- activeThreadsLock.release()
-
- class JobClassWrapper( object ):
- def __init__( self, job ):
- self.job = job
- def run( self ):
- self.job.run()
- jobFinished()
-
- while True:
-
- # If all threads are busy, block
- underCapacity.wait()
-
- # Wait for any activity on the main socket or parent socket.
- r, w, e = select.select([sock, parent], [], [])
-
- for f in r:
- # If there's any activity on the parent socket, it
- # means the parent wants us to die or has died itself.
- # Either way, exit.
- if f is parent:
- return
-
- # Otherwise, there's activity on the main socket...
- try:
- clientSock, addr = sock.accept()
- except socket.error, e:
- if e[0] == errno.EAGAIN:
- # Or maybe not.
- continue
- raise
-
- setCloseOnExec(clientSock)
-
- # Check if this client is allowed.
- if not self._isClientAllowed(addr):
- clientSock.close()
- continue
-
- # Notify parent if we're no longer available.
- activeThreadsLock.acquire()
- try:
- activeThreads[0] += 1
- if activeThreads[0] == self._maxThreads:
- # No longer under capacity
- underCapacity.clear()
- # Tell parent
- try:
- parent.send('\x00')
- except socket.error, e:
- # If parent is gone, finish up this request.
- if e[0] != errno.EPIPE:
- raise
- finally:
- activeThreadsLock.release()
-
- ## # Do the job.
- ## self._jobClass(clientSock, addr, *self._jobArgs).run()
-
- ## print "Dispatching job"
-
- # Hand off to Connection.
- conn = JobClassWrapper( self._jobClass(clientSock, addr, *self._jobArgs) )
- # Since we track maxThreads we can allow queueing here, just queues
- # long enough for the callback above to finish.
- if not threadPool.addJob(conn, allowQueuing=True):
- # Should never happen since we track maxThreads carefully
- # outside of the pool
- raise Exception( "Something has gone terribly wrong" )
-
- # If we've serviced the maximum number of requests, exit.
- if self._maxRequests > 0:
- requestCount += 1
- if requestCount >= self._maxRequests:
- # Need to allow threads to finish up here.
- break
-
- # Signal handlers
-
- def _hupHandler(self, signum, frame):
- self._keepGoing = False
- self._hupReceived = True
-
- def _intHandler(self, signum, frame):
- self._keepGoing = False
-
- def _chldHandler(self, signum, frame):
- # Do nothing (breaks us out of select and allows us to reap children).
- pass
-
- def _installSignalHandlers(self):
- supportedSignals = [signal.SIGINT, signal.SIGTERM]
- if hasattr(signal, 'SIGHUP'):
- supportedSignals.append(signal.SIGHUP)
-
- self._oldSIGs = [(x,signal.getsignal(x)) for x in supportedSignals]
-
- for sig in supportedSignals:
- if hasattr(signal, 'SIGHUP') and sig == signal.SIGHUP:
- signal.signal(sig, self._hupHandler)
- else:
- signal.signal(sig, self._intHandler)
-
- def _restoreSignalHandlers(self):
- """Restores previous signal handlers."""
- for signum,handler in self._oldSIGs:
- signal.signal(signum, handler)
-
-if __name__ == '__main__':
- class TestJob(object):
- def __init__(self, sock, addr):
- self._sock = sock
- self._addr = addr
- def run(self):
- print "Client connection opened from %s:%d" % self._addr
- self._sock.send('Hello World!\n')
- self._sock.setblocking(1)
- self._sock.recv(1)
- self._sock.close()
- print "Client connection closed from %s:%d" % self._addr
- sock = socket.socket()
- sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
- sock.bind(('', 8080))
- sock.listen(socket.SOMAXCONN)
- PreforkThreadedServer(maxChildren=10, jobClass=TestJob).run(sock)
diff -r fba947d16fa7 -r 6b924dd68e77 lib/galaxy/web/framework/servers/fork_server.py
--- a/lib/galaxy/web/framework/servers/fork_server.py Fri Aug 28 18:03:28 2009 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,231 +0,0 @@
-"""
-HTTPServer implementation that uses a thread pool based SocketServer (similar
-to the approach used by CherryPy) and the WSGIHandler request handler from
-Paste.
-
-NOTE: NOT HEAVILY TESTED, DO NOT USE IN PRODUCTION!
-"""
-
-import SocketServer
-import Queue
-import threading
-import thread
-import sys
-import socket
-import select
-import os
-import signal
-import errno
-
-import logging
-log = logging.getLogger( __name__ )
-
-import pkg_resources;
-pkg_resources.require( "Paste" )
-from paste.httpserver import WSGIHandler
-
-class ThreadPool( object ):
- """
- Generic thread pool with a queue of callables to consume
- """
- SHUTDOWN = object()
- def __init__( self, nworkers, name="ThreadPool" ):
- """
- Create thread pool with `nworkers` worker threads
- """
- self.nworkers = nworkers
- self.name = name
- self.queue = Queue.Queue()
- self.workers = []
- self.worker_tracker = {}
- for i in range( self.nworkers ):
- worker = threading.Thread( target=self.worker_thread_callback,
- name=( "%s worker %d" % ( self.name, i ) ) )
- worker.start()
- self.workers.append( worker )
- def worker_thread_callback( self ):
- """
- Worker thread should call this method to get and process queued
- callables
- """
- while 1:
- runnable = self.queue.get()
- if runnable is ThreadPool.SHUTDOWN:
- return
- else:
- self.worker_tracker[thread.get_ident()] = [None, None]
- try:
- runnable()
- finally:
- try:
- del self.worker_tracker[thread.get_ident()]
- except KeyError:
- pass
- def shutdown( self ):
- """
- Shutdown the queue (after finishing any pending requests)
- """
- # Add a shutdown request for every worker
- for i in range( self.nworkers ):
- self.queue.put( ThreadPool.SHUTDOWN )
- # Wait for each thread to terminate
- for worker in self.workers:
- worker.join()
-
-class PreforkThreadPoolServer( SocketServer.TCPServer ):
- """
- Server that uses a pool of threads for request handling
- """
- allow_reuse_address = 1
- def __init__( self, server_address, request_handler, nworkers, nprocesses ):
- # Create and start the workers
- self.nprocesses = nprocesses
- self.nworkers = nworkers
- self.running = True
- assert nworkers > 0, "ThreadPoolServer must have at least one worker"
- # Call the base class constructor
- SocketServer.TCPServer.__init__( self, server_address, request_handler )
-
- def get_request( self ):
- self.socket_lock.acquire()
- try:
- return self.socket.accept()
- finally:
- self.socket_lock.release()
-
- def serve_forever(self):
- """
- Overrides `serve_forever` to shutdown cleanly.
- """
- log.info( "Serving requests..." )
- # Pre-fork each child
- children = []
- for i in range( self.nprocesses ):
- pid = os.fork()
- if pid:
- # We are in the parent process
- children.append( pid )
- else:
- # We are in the child process
- signal.signal( signal.SIGINT, self.child_sigint_handler )
- self.time_to_terminate = threading.Event()
- self.socket_lock = threading.Lock()
- self.pid = os.getpid()
- self.serve_forever_child()
- sys.exit( 0 )
- # Wait
- try:
- while len( children ) > 0:
- pid, status = os.wait()
- children.remove( pid )
- except KeyboardInterrupt:
- # Cleanup, kill all children
- print "Killing Children"
- for child in children:
- os.kill( child, signal.SIGINT )
- # Setup and alarm for 10 seconds
- signal.signal( signal.SIGALRM, lambda x, y: None )
- signal.alarm( 10 )
- # Wait
- while len( children ) > 0:
- try:
- pid, status = os.wait()
- children.remove( pid )
- except OSError, e:
- if e[0] in (errno.ECHILD, errno.EINTR):
- break
- # Kill any left
- print "Killing"
- for child in children:
- os.kill( child, signal.SIGKILL )
- log.info( "Shutting down..." )
-
- def serve_forever_child( self ):
- # self.thread_pool = ThreadPool( self.nworkers, "ThreadPoolServer on %s:%d" % self.server_address )
- self.workers = []
- for i in range( self.nworkers ):
- worker = threading.Thread( target=self.serve_forever_thread )
- worker.start()
- self.workers.append( worker )
- self.time_to_terminate.wait()
- print "Terminating"
- for thread in self.workers:
- thread.join()
- self.socket.close()
-
- def serve_forever_thread( self ):
- while self.running:
- self.handle_request()
-
- def child_sigint_handler( self, signum, frame ):
- print "Shutting down child"
- self.shutdown()
-
- def shutdown( self ):
- """
- Finish pending requests and shutdown the server
- """
- self.running = False
- self.time_to_terminate.set()
-
- ## def server_activate(self):
- ## """
- ## Overrides server_activate to set timeout on our listener socket
- ## """
- ## # We set the timeout here so that we can trap ^C on windows
- ## self.socket.settimeout(1)
- ## SocketServer.TCPServer.server_activate(self)
-
-class WSGIPreforkThreadPoolServer( PreforkThreadPoolServer ):
- """
- Server that mixes ThreadPoolServer and WSGIHandler
- """
- def __init__( self, wsgi_application, server_address, *args, **kwargs ):
- PreforkThreadPoolServer.__init__( self, server_address, WSGIHandler, *args, **kwargs )
- self.wsgi_application = wsgi_application
- self.wsgi_socket_timeout = None
- def get_request(self):
- # If there is a socket_timeout, set it on the accepted
- (conn,info) = PreforkThreadPoolServer.get_request(self)
- if self.wsgi_socket_timeout:
- conn.settimeout(self.wsgi_socket_timeout)
- return (conn, info)
-
-
-
-
-
-def serve( wsgi_app, global_conf, host="127.0.0.1", port="8080",
- server_version=None, protocol_version=None, start_loop=True,
- daemon_threads=None, socket_timeout=None, nworkers=10, nprocesses=10 ):
- """
- Similar to `paste.httpserver.serve` but using the thread pool server
- """
- server_address = ( host, int( port ) )
-
- # if server_version:
- # handler.server_version = server_version
- # handler.sys_version = None
- # if protocol_version:
- # assert protocol_version in ('HTTP/0.9','HTTP/1.0','HTTP/1.1')
- # handler.protocol_version = protocol_version
-
- server = WSGIPreforkThreadPoolServer( wsgi_app, server_address, int( nworkers ), int( nprocesses ) )
- if daemon_threads:
- server.daemon_threads = daemon_threads
- if socket_timeout:
- server.wsgi_socket_timeout = int(socket_timeout)
-
- print "serving on %s:%s" % server.server_address
- if start_loop:
- try:
- server.serve_forever()
- except KeyboardInterrupt:
- # allow CTRL+C to shutdown
- pass
- return server
-
-if __name__ == '__main__':
- from paste.wsgilib import dump_environ
- serve(dump_environ, {}, server_version="Wombles/1.0",
- protocol_version="HTTP/1.1", port="8881")
diff -r fba947d16fa7 -r 6b924dd68e77 lib/galaxy/web/framework/servers/threadpool_server.py
--- a/lib/galaxy/web/framework/servers/threadpool_server.py Fri Aug 28 18:03:28 2009 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,219 +0,0 @@
-"""
-HTTPServer implementation that uses a thread pool based SocketServer (similar
-to the approach used by CherryPy) and the WSGIHandler request handler from
-Paste.
-
-NOTE: Most of the improvments from this implementation have been moved into
- the Paste HTTP server, this should be considered deprecated.
-
-Preliminary numbers from "ab -c 50 -n 500 http://localhost:8080/", all tests
-with transaction level logging. Application processes a simple cheetah
-template (using compiled NameMapper).
-
-CherryPy 2.1
-------------
-
-Percentage of the requests served within a certain time (ms)
- 50% 354
- 66% 452
- 75% 601
- 80% 674
- 90% 2868
- 95% 3000
- 98% 3173
- 99% 3361
- 100% 6145 (last request)
-
-Paste with Paste#http server (ThreadingMixIn based)
----------------------------------------------------
-
-Percentage of the requests served within a certain time (ms)
- 50% 84
- 66% 84
- 75% 84
- 80% 84
- 90% 85
- 95% 86
- 98% 92
- 99% 97
- 100% 99 (last request)
-
-This module
------------
-
-Percentage of the requests served within a certain time (ms)
- 50% 19
- 66% 23
- 75% 26
- 80% 29
- 90% 41
- 95% 50
- 98% 70
- 99% 80
- 100% 116 (last request)
-
-"""
-
-import SocketServer
-import Queue
-import threading
-import socket
-
-import logging
-log = logging.getLogger( __name__ )
-
-import pkg_resources;
-pkg_resources.require( "Paste" )
-from paste.httpserver import WSGIHandler
-
-class ThreadPool( object ):
- """
- Generic thread pool with a queue of callables to consume
- """
- SHUTDOWN = object()
- def __init__( self, nworkers, name="ThreadPool" ):
- """
- Create thread pool with `nworkers` worker threads
- """
- self.nworkers = nworkers
- self.name = name
- self.queue = Queue.Queue()
- self.workers = []
- for i in range( self.nworkers ):
- worker = threading.Thread( target=self.worker_thread_callback,
- name=( "%s worker %d" % ( self.name, i ) ) )
- worker.start()
- self.workers.append( worker )
- def worker_thread_callback( self ):
- """
- Worker thread should call this method to get and process queued
- callables
- """
- while 1:
- runnable = self.queue.get()
- if runnable is ThreadPool.SHUTDOWN:
- return
- else:
- runnable()
- def shutdown( self ):
- """
- Shutdown the queue (after finishing any pending requests)
- """
- # Add a shutdown request for every worker
- for i in range( self.nworkers ):
- self.queue.put( ThreadPool.SHUTDOWN )
- # Wait for each thread to terminate
- for worker in self.workers:
- worker.join()
-
-class ThreadPoolServer( SocketServer.TCPServer ):
- """
- Server that uses a pool of threads for request handling
- """
- allow_reuse_address = 1
- def __init__( self, server_address, request_handler, nworkers ):
- # Create and start the workers
- self.running = True
- assert nworkers > 0, "ThreadPoolServer must have at least one worker"
- self.thread_pool = ThreadPool( nworkers, "ThreadPoolServer on %s:%d" % server_address )
- # Call the base class constructor
- SocketServer.TCPServer.__init__( self, server_address, request_handler )
- def process_request( self, request, client_address ):
- """
- Queue the request to be processed by on of the thread pool threads
- """
- # This sets the socket to blocking mode (and no timeout) since it
- # may take the thread pool a little while to get back to it. (This
- # is the default but since we set a timeout on the parent socket so
- # that we can trap interrupts we need to restore this,.)
- request.setblocking( 1 )
- # Queue processing of the request
- self.thread_pool.queue.put( lambda: self.process_request_in_thread( request, client_address ) )
- def process_request_in_thread( self, request, client_address ):
- """
- The worker thread should call back here to do the rest of the
- request processing.
- """
- try:
- self.finish_request( request, client_address )
- self.close_request( request)
- except:
- self.handle_error( request, client_address )
- self.close_request( request )
- def serve_forever(self):
- """
- Overrides `serve_forever` to shutdown cleanly.
- """
- try:
- log.info( "Serving requests..." )
- while self.running:
- try:
- self.handle_request()
- except socket.timeout:
- # Timeout is expected, gives interrupts a chance to
- # propogate, just keep handling
- pass
- log.info( "Shutting down..." )
- finally:
- self.thread_pool.shutdown()
- def shutdown( self ):
- """
- Finish pending requests and shutdown the server
- """
- self.running = False
- self.socket.close()
- def server_activate(self):
- """
- Overrides server_activate to set timeout on our listener socket
- """
- # We set the timeout here so that we can trap ^C on windows
- self.socket.settimeout(1)
- SocketServer.TCPServer.server_activate(self)
-
-class WSGIThreadPoolServer( ThreadPoolServer ):
- """
- Server that mixes ThreadPoolServer and WSGIHandler
- """
- def __init__( self, wsgi_application, server_address, *args, **kwargs ):
- ThreadPoolServer.__init__( self, server_address, WSGIHandler, *args, **kwargs )
- self.wsgi_application = wsgi_application
- self.wsgi_socket_timeout = None
- def get_request(self):
- # If there is a socket_timeout, set it on the accepted
- (conn,info) = ThreadPoolServer.get_request(self)
- if self.wsgi_socket_timeout:
- conn.settimeout(self.wsgi_socket_timeout)
- return (conn, info)
-
-def serve( wsgi_app, global_conf, host="127.0.0.1", port="8080",
- server_version=None, protocol_version=None, start_loop=True,
- daemon_threads=None, socket_timeout=None, nworkers=10 ):
- """
- Similar to `paste.httpserver.serve` but using the thread pool server
- """
- server_address = ( host, int( port ) )
-
- if server_version:
- handler.server_version = server_version
- handler.sys_version = None
- if protocol_version:
- assert protocol_version in ('HTTP/0.9','HTTP/1.0','HTTP/1.1')
- handler.protocol_version = protocol_version
-
- server = WSGIThreadPoolServer( wsgi_app, server_address, int( nworkers ) )
- if daemon_threads:
- server.daemon_threads = daemon_threads
- if socket_timeout:
- server.wsgi_socket_timeout = int(socket_timeout)
-
- print "serving on %s:%s" % server.server_address
- if start_loop:
- try:
- server.serve_forever()
- except KeyboardInterrupt:
- # allow CTRL+C to shutdown
- pass
- return server
-
-
-
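The PreforkThreadedServer docstring in the removed flup code above describes the balancing rule its parent loop enforces: keep at least minSpare idle children, retire idle children above maxSpare, and never exceed maxChildren (all defaulting to 32). The following is a minimal standalone sketch of that rule only; the function name and shape are hypothetical and are not taken from the removed module.

    def balance_children(idle, total, min_spare=32, max_spare=32, max_children=32):
        """One pass of the parent loop: how many children to spawn or retire."""
        spawn = retire = 0
        if idle < min_spare:
            # Spawn until min_spare children are idle, but never exceed max_children.
            spawn = min(min_spare - idle, max_children - total)
        elif idle > max_spare:
            # Too many spares; the extras get closed/killed.
            retire = idle - max_spare
        return spawn, retire

    # balance_children(idle=3, total=20, min_spare=8, max_spare=16, max_children=32)
    # -> (5, 0): spawn five more children so that eight are idle.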
details: http://www.bx.psu.edu/hg/galaxy/rev/7aa2475b7fa9
changeset: 2648:7aa2475b7fa9
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:40:58 2009 -0400
description:
add tagging to HDAs
1 file(s) affected in this change:
templates/dataset/edit_attributes.mako
diffs (74 lines):
diff -r 77dfb7834a94 -r 7aa2475b7fa9 templates/dataset/edit_attributes.mako
--- a/templates/dataset/edit_attributes.mako Fri Aug 28 15:40:32 2009 -0400
+++ b/templates/dataset/edit_attributes.mako Fri Aug 28 15:40:58 2009 -0400
@@ -3,6 +3,50 @@
<%def name="title()">${_('Edit Dataset Attributes')}</%def>
+<%def name="stylesheets()">
+ ${h.css( "base", "history", "autocomplete_tagging" )}
+</%def>
+
+<%def name="javascripts()">
+ ## <!--[if lt IE 7]>
+ ## <script type='text/javascript' src="/static/scripts/IE7.js"> </script>
+ ## <![endif]-->
+ ${h.js( "jquery", "galaxy.base", "jquery.autocomplete", "autocomplete_tagging" )}
+ <script type="text/javascript">
+ $( document ).ready( function() {
+ // Set up autocomplete tagger.
+<%
+ ## Build string of tag name, values.
+ tag_names_and_values = list()
+ for tag in data.tags:
+ tag_name = tag.user_tname
+ tag_value = ""
+ if tag.value is not None:
+ tag_value = tag.user_value
+ tag_names_and_values.append("\"" + tag_name + "\" : \"" + tag_value + "\"")
+%>
+ var options =
+ {
+ tags : {${", ".join(tag_names_and_values)}},
+ tag_click_fn: function(tag) { /* Do nothing. */ },
+ use_toggle_link: false,
+ input_size: 30,
+ in_form: true,
+ <% encoded_data_id = trans.security.encode_id(data.id) %>
+ ajax_autocomplete_tag_url: "${h.url_for( controller='tag', action='tag_autocomplete_data', id=encoded_data_id, item_type="hda" )}",
+ ajax_add_tag_url: "${h.url_for( controller='tag', action='add_tag_async', id=encoded_data_id, item_type="hda" )}",
+ ajax_delete_tag_url: "${h.url_for( controller='tag', action='remove_tag_async', id=encoded_data_id, item_type="hda" )}",
+ delete_tag_img: "${h.url_for('/static/images/delete_tag_icon_gray.png')}",
+ delete_tag_img_rollover: "${h.url_for('/static/images/delete_tag_icon_white.png')}",
+ add_tag_img: "${h.url_for('/static/images/add_icon.png')}",
+ add_tag_img_rollover: "${h.url_for('/static/images/add_icon_dark.png')}",
+ };
+% if trans.get_user() is not None:
+ $("#dataset-tag-area").autocomplete_tagging(options);
+ });
+% endif
+ </script>
+</%def>
<%def name="datatype( dataset, datatypes )">
<select name="datatype">
@@ -38,7 +82,18 @@
<input type="text" name="info" value="${data.info}" size="40"/>
</div>
<div style="clear: both"></div>
- </div>
+ </div>
+ %if trans.get_user() is not None:
+ <div class="form-row">
+ <label>
+ Tags:
+ </label>
+ <div id="dataset-tag-area"
+ style="float: left; margin-left: 1px; width: 295px; margin-right: 10px; border-style: inset; border-color: #ddd; border-width: 1px">
+ </div>
+ <div style="clear: both"></div>
+ </div>
+ %endif
%for name, spec in data.metadata.spec.items():
%if spec.visible:
<div class="form-row">

29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/3ef81a4d574e
changeset: 2649:3ef81a4d574e
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:42:37 2009 -0400
description:
tags module for handling and processing tags
2 file(s) affected in this change:
lib/galaxy/tags/__init__.py
lib/galaxy/tags/tag_handler.py
diffs (238 lines):
diff -r 7aa2475b7fa9 -r 3ef81a4d574e lib/galaxy/tags/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/tags/__init__.py Fri Aug 28 15:42:37 2009 -0400
@@ -0,0 +1,3 @@
+"""
+Galaxy tagging classes and methods.
+"""
diff -r 7aa2475b7fa9 -r 3ef81a4d574e lib/galaxy/tags/tag_handler.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/tags/tag_handler.py Fri Aug 28 15:42:37 2009 -0400
@@ -0,0 +1,226 @@
+from galaxy.model import Tag, History, HistoryTagAssociation, Dataset, DatasetTagAssociation, HistoryDatasetAssociation, HistoryDatasetAssociationTagAssociation
+import re
+
+class TagHandler( object ):
+
+ # Tag separator.
+ tag_separators = ',;'
+
+ # Hierarchy separator.
+ hierarchy_separator = '.'
+
+ # Key-value separator.
+ key_value_separators = "=:"
+
+ def __init__(self):
+ self.tag_assoc_classes = dict()
+
+ def add_tag_assoc_class(self, entity_class, tag_assoc_class):
+ self.tag_assoc_classes[entity_class] = tag_assoc_class
+
+ def get_tag_assoc_class(self, entity_class):
+ return self.tag_assoc_classes[entity_class]
+
+ # Remove a tag from an item.
+ def remove_item_tag(self, item, tag_name):
+ # Get item tag association.
+ item_tag_assoc = self._get_item_tag_assoc(item, tag_name)
+
+ # Remove association.
+ if item_tag_assoc:
+ # Delete association.
+ item_tag_assoc.delete()
+ item.tags.remove(item_tag_assoc)
+ return True
+
+ return False
+
+ # Delete tags from an item.
+ def delete_item_tags(self, item):
+ # Delete item-tag associations.
+ for tag in item.tags:
+ tag.delete()
+
+ # Delete tags from item.
+ del item.tags[:]
+
+ # Returns true if item is has a given tag.
+ def item_has_tag(self, item, tag_name):
+ # Check for an item-tag association to see if item has a given tag.
+ item_tag_assoc = self._get_item_tag_assoc(item, tag_name)
+ if item_tag_assoc:
+ return True
+ return False
+
+
+ # Apply tags to an item.
+ def apply_item_tags(self, db_session, item, tags_str):
+ # Parse tags.
+ parsed_tags = self._parse_tags(tags_str)
+
+ # Apply each tag.
+ for name, value in parsed_tags.items():
+ # Get or create item-tag association.
+ item_tag_assoc = self._get_item_tag_assoc(item, name)
+ if not item_tag_assoc:
+ #
+ # Create item-tag association.
+ #
+
+ # Create tag; if None, skip the tag (and log error).
+ tag = self._get_or_create_tag(db_session, name)
+ if not tag:
+ # Log error?
+ continue
+
+ # Create tag association based on item class.
+ item_tag_assoc_class = self.tag_assoc_classes[item.__class__]
+ item_tag_assoc = item_tag_assoc_class()
+
+ # Add tag to association.
+ item.tags.append(item_tag_assoc)
+ item_tag_assoc.tag = tag
+
+ # Apply attributes to item-tag association. Strip whitespace from user name and tag.
+ if value:
+ trimmed_value = value.strip()
+ else:
+ trimmed_value = value
+ item_tag_assoc.user_tname = name.strip()
+ item_tag_assoc.user_value = trimmed_value
+ item_tag_assoc.value = self._scrub_tag_value(value)
+
+ # Build a string from an item's tags.
+ def get_tags_str(self, tags):
+ # Return empty string if there are no tags.
+ if not tags:
+ return ""
+
+ # Create string of tags.
+ tags_str_list = list()
+ for tag in tags:
+ tag_str = tag.user_tname
+ if tag.value is not None:
+ tag_str += ":" + tag.user_value
+ tags_str_list.append(tag_str)
+ return ", ".join(tags_str_list)
+
+ # Get a Tag object from a tag string.
+ def _get_tag(self, db_session, tag_str):
+ return db_session.query(Tag).filter(Tag.name==tag_str).first()
+
+ # Create a Tag object from a tag string.
+ def _create_tag(self, db_session, tag_str):
+ tag_hierarchy = tag_str.split(self.__class__.hierarchy_separator)
+ tag_prefix = ""
+ parent_tag = None
+ for sub_tag in tag_hierarchy:
+ # Get or create subtag.
+ tag_name = tag_prefix + self._scrub_tag_name(sub_tag)
+ tag = db_session.query(Tag).filter(Tag.name==tag_name).first()
+ if not tag:
+ tag = Tag(type="generic", name=tag_name)
+
+ # Set tag parent.
+ tag.parent = parent_tag
+
+ # Update parent and tag prefix.
+ parent_tag = tag
+ tag_prefix = tag.name + self.__class__.hierarchy_separator
+ return tag
+
+ # Get or create a Tag object from a tag string.
+ def _get_or_create_tag(self, db_session, tag_str):
+ # Scrub tag; if tag is None after being scrubbed, return None.
+ scrubbed_tag_str = self._scrub_tag_name(tag_str)
+ if not scrubbed_tag_str:
+ return None
+
+ # Get item tag.
+ tag = self._get_tag(db_session, scrubbed_tag_str)
+
+ # Create tag if necessary.
+ if tag is None:
+ tag = self._create_tag(db_session, scrubbed_tag_str)
+
+ return tag
+
+ # Return ItemTagAssociation object for an item and a tag string; returns None if there is
+ # no such tag.
+ def _get_item_tag_assoc(self, item, tag_name):
+ scrubbed_tag_name = self._scrub_tag_name(tag_name)
+ for item_tag_assoc in item.tags:
+ if item_tag_assoc.tag.name == scrubbed_tag_name:
+ return item_tag_assoc
+ return None
+
+ # Returns a list of raw (tag-name, value) pairs derived from a string; method does not scrub tags.
+ # Return value is a dictionary where tag-names are keys.
+ def _parse_tags(self, tag_str):
+ # Gracefully handle None.
+ if not tag_str:
+ return dict()
+
+ # Split tags based on separators.
+ reg_exp = re.compile('[' + self.__class__.tag_separators + ']')
+ raw_tags = reg_exp.split(tag_str)
+
+ # Extract name-value pairs.
+ name_value_pairs = dict()
+ for raw_tag in raw_tags:
+ nv_pair = self._get_name_value_pair(raw_tag)
+ name_value_pairs[nv_pair[0]] = nv_pair[1]
+ return name_value_pairs
+
+ # Scrub a tag value.
+ def _scrub_tag_value(self, value):
+ # Gracefully handle None:
+ if not value:
+ return None
+
+ # Remove whitespace from value.
+ reg_exp = re.compile('\s')
+ scrubbed_value = re.sub(reg_exp, "", value)
+
+ # Lowercase and return.
+ return scrubbed_value.lower()
+
+ # Scrub a tag name.
+ def _scrub_tag_name(self, name):
+ # Gracefully handle None:
+ if not name:
+ return None
+
+ # Remove whitespace from name.
+ reg_exp = re.compile('\s')
+ scrubbed_name = re.sub(reg_exp, "", name)
+
+ # Ignore starting ':' char.
+ if scrubbed_name.startswith(self.__class__.hierarchy_separator):
+ scrubbed_name = scrubbed_name[1:]
+
+ # If name is too short or too long, return None.
+ if len(scrubbed_name) < 3 or len(scrubbed_name) > 255:
+ return None
+
+ # Lowercase and return.
+ return scrubbed_name.lower()
+
+ # Scrub a tag name list.
+ def _scrub_tag_name_list(self, tag_name_list):
+ scrubbed_tag_list = list()
+ for tag in tag_name_list:
+ scrubbed_tag_list.append(self._scrub_tag_name(tag))
+ return scrubbed_tag_list
+
+ # Get name, value pair from a tag string.
+ def _get_name_value_pair(self, tag_str):
+ # Use regular expression to parse name, value.
+ reg_exp = re.compile("[" + self.__class__.key_value_separators + "]")
+ name_value_pair = reg_exp.split(tag_str)
+
+ # Add empty slot if tag does not have value.
+ if len(name_value_pair) < 2:
+ name_value_pair.append(None)
+
+ return name_value_pair
\ No newline at end of file
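The TagHandler above splits a raw tag string on ',' or ';', splits each piece on '=' or ':' into a name/value pair, strips whitespace, lowercases, and drops names shorter than 3 or longer than 255 characters. Below is a standalone sketch that collapses _parse_tags plus the scrubbing steps into one hypothetical helper; the real class keeps these as separate methods and also builds the hierarchical parent tags.

    import re

    def parse_and_scrub(tag_str, separators=',;', kv_separators='=:'):
        """Return {scrubbed_name: scrubbed_value} for a raw tag string."""
        pairs = {}
        for raw in re.split('[%s]' % separators, tag_str or ''):
            parts = re.split('[%s]' % kv_separators, raw)
            name = re.sub(r'\s', '', parts[0]).lower()
            value = re.sub(r'\s', '', parts[1]).lower() if len(parts) > 1 else None
            if 3 <= len(name) <= 255:   # names outside this range are discarded
                pairs[name] = value
        return pairs

    # parse_and_scrub("Genome.HG18; cell line=K562")
    # -> {'genome.hg18': None, 'cellline': 'k562'}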
details: http://www.bx.psu.edu/hg/galaxy/rev/77dfb7834a94
changeset: 2647:77dfb7834a94
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:40:32 2009 -0400
description:
add tagging to histories
1 file(s) affected in this change:
templates/root/history.mako
diffs (118 lines):
diff -r 817c183aa633 -r 77dfb7834a94 templates/root/history.mako
--- a/templates/root/history.mako Fri Aug 28 15:38:46 2009 -0400
+++ b/templates/root/history.mako Fri Aug 28 15:40:32 2009 -0400
@@ -14,8 +14,8 @@
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta http-equiv="Pragma" content="no-cache">
-${h.css( "base", "history" )}
-${h.js( "jquery", "json2", "jquery.jstore-all" )}
+${h.css( "base", "history", "autocomplete_tagging" )}
+${h.js( "jquery", "json2", "jquery.jstore-all", "jquery.autocomplete", "autocomplete_tagging" )}
<script type="text/javascript">
$(function() {
@@ -83,6 +83,93 @@
%endif
%endfor
});
+
+ //
+ // Set up autocomplete tagger.
+ //
+<%
+ ## Build string of tag name, values.
+ tag_names_and_values = list()
+ for tag in history.tags:
+ tag_name = tag.user_tname
+ tag_value = ""
+ if tag.value is not None:
+ tag_value = tag.user_value
+ tag_names_and_values.append("\"" + tag_name + "\" : \"" + tag_value + "\"")
+%>
+ //
+ // Returns the number of keys (elements) in an array/dictionary.
+ //
+ var array_length = function(an_array)
+ {
+ if (an_array.length)
+ return an_array.length;
+
+ var count = 0;
+ for (element in an_array)
+ count++;
+ return count;
+ };
+
+ //
+ // Function get text to display on the toggle link.
+ //
+ var get_toggle_link_text = function(tags)
+ {
+ var text = "";
+ var num_tags = array_length(tags);
+ if (num_tags != 0)
+ {
+ text = num_tags + (num_tags != 1 ? " Tags" : " Tag");
+ /*
+ // Show first N tags; hide the rest.
+ var max_to_show = 1;
+
+ // Build tag string.
+ var tag_strs = new Array();
+ var count = 0;
+ for (tag_name in tags)
+ {
+ tag_value = tags[tag_name];
+ tag_strs[tag_strs.length] = build_tag_str(tag_name, tag_value);
+ if (++count == max_to_show)
+ break;
+ }
+ tag_str = tag_strs.join(", ");
+
+ // Finalize text.
+ var num_tags_hiding = num_tags - max_to_show;
+ text = "Tags: " + tag_str +
+ (num_tags_hiding != 0 ? " and " + num_tags_hiding + " more" : "");
+ */
+ }
+ else
+ {
+ // No tags.
+ text = "Add tags to this history";
+ }
+ return text;
+ };
+
+ var options =
+ {
+ tags : {${", ".join(tag_names_and_values)}},
+ get_toggle_link_text_fn: get_toggle_link_text,
+ input_size: 15,
+ tag_click_fn: function(tag) { /* Do nothing. */ },
+ <% encoded_history_id = trans.security.encode_id(history.id) %>
+ ajax_autocomplete_tag_url: "${h.url_for( controller='tag', action='tag_autocomplete_data', id=encoded_history_id, item_type="history" )}",
+ ajax_add_tag_url: "${h.url_for( controller='tag', action='add_tag_async', id=encoded_history_id, item_type="history" )}",
+ ajax_delete_tag_url: "${h.url_for( controller='tag', action='remove_tag_async', id=encoded_history_id, item_type="history" )}",
+ delete_tag_img: "${h.url_for('/static/images/delete_tag_icon_gray.png')}",
+ delete_tag_img_rollover: "${h.url_for('/static/images/delete_tag_icon_white.png')}",
+ add_tag_img: "${h.url_for('/static/images/add_icon.png')}",
+ add_tag_img_rollover: "${h.url_for('/static/images/add_icon_dark.png')}",
+ };
+% if trans.get_user() is not None:
+ $("#history-tag-area").autocomplete_tagging(options);
+% endif
+
});
// Functionized so AJAX'd datasets can call them
// Get shown/hidden state from cookie
@@ -290,6 +377,9 @@
<p></p>
%endif
+<div id="history-tag-area" style="margin-bottom: 1em">
+</div>
+
<%namespace file="history_common.mako" import="render_dataset" />
%if not datasets:

29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/817c183aa633
changeset: 2646:817c183aa633
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:38:46 2009 -0400
description:
add autocomplete_tagging.css.tmpl to templates list
1 file(s) affected in this change:
static/june_2007_style/make_style.py
diffs (13 lines):
diff -r 6f43113ab087 -r 817c183aa633 static/june_2007_style/make_style.py
--- a/static/june_2007_style/make_style.py Fri Aug 28 15:36:41 2009 -0400
+++ b/static/june_2007_style/make_style.py Fri Aug 28 15:38:46 2009 -0400
@@ -27,7 +27,8 @@
( "history.css.tmpl", "history.css" ),
( "tool_menu.css.tmpl", "tool_menu.css" ),
( "iphone.css.tmpl", "iphone.css" ),
- ( "reset.css.tmpl", "reset.css" ) ]
+ ( "reset.css.tmpl", "reset.css" ),
+ ( "autocomplete_tagging.css.tmpl", "autocomplete_tagging.css") ]
images = [
( "./gradient.py 9 30 $panel_header_bg_top - $panel_header_bg_bottom 0 0 $panel_header_bg_bottom 1 1", "panel_header_bg.png" ),
details: http://www.bx.psu.edu/hg/galaxy/rev/6f43113ab087
changeset: 2645:6f43113ab087
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:36:41 2009 -0400
description:
changes to the model to support tags
1 file(s) affected in this change:
lib/galaxy/model/mapping.py
diffs (94 lines):
diff -r b6b4d07ad087 -r 6f43113ab087 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py Fri Aug 28 15:35:27 2009 -0400
+++ b/lib/galaxy/model/mapping.py Fri Aug 28 15:36:41 2009 -0400
@@ -544,6 +544,34 @@
Column( "content", TEXT )
)
+Tag.table = Table( "tag", metadata,
+ Column( "id", Integer, primary_key=True ),
+ Column( "type", Integer ),
+ Column( "parent_id", Integer, ForeignKey( "tag.id" ) ),
+ Column( "name", TrimmedString(255) ),
+ UniqueConstraint( "name" ) )
+
+HistoryTagAssociation.table = Table( "history_tag_association", metadata,
+ Column( "history_id", Integer, ForeignKey( "history.id" ), index=True ),
+ Column( "tag_id", Integer, ForeignKey( "tag.id" ), index=True ),
+ Column( "user_tname", TrimmedString(255), index=True),
+ Column( "value", TrimmedString(255), index=True),
+ Column( "user_value", TrimmedString(255), index=True) )
+
+DatasetTagAssociation.table = Table( "dataset_tag_association", metadata,
+ Column( "dataset_id", Integer, ForeignKey( "dataset.id" ), index=True ),
+ Column( "tag_id", Integer, ForeignKey( "tag.id" ), index=True ),
+ Column( "user_tname", TrimmedString(255), index=True),
+ Column( "value", TrimmedString(255), index=True),
+ Column( "user_value", TrimmedString(255), index=True) )
+
+HistoryDatasetAssociationTagAssociation.table = Table( "history_dataset_association_tag_association", metadata,
+ Column( "history_dataset_association_id", Integer, ForeignKey( "history_dataset_association.id" ), index=True ),
+ Column( "tag_id", Integer, ForeignKey( "tag.id" ), index=True ),
+ Column( "user_tname", TrimmedString(255), index=True),
+ Column( "value", TrimmedString(255), index=True),
+ Column( "user_value", TrimmedString(255), index=True) )
+
# With the tables defined we can define the mappers and setup the
# relationships between the model objects.
@@ -643,7 +671,8 @@
backref=backref( "parent", primaryjoin=( HistoryDatasetAssociation.table.c.parent_id == HistoryDatasetAssociation.table.c.id ), remote_side=[HistoryDatasetAssociation.table.c.id], uselist=False ) ),
visible_children=relation(
HistoryDatasetAssociation,
- primaryjoin=( ( HistoryDatasetAssociation.table.c.parent_id == HistoryDatasetAssociation.table.c.id ) & ( HistoryDatasetAssociation.table.c.visible == True ) ) )
+ primaryjoin=( ( HistoryDatasetAssociation.table.c.parent_id == HistoryDatasetAssociation.table.c.id ) & ( HistoryDatasetAssociation.table.c.visible == True ) ) ),
+ tags=relation(HistoryDatasetAssociationTagAssociation, backref='history_tag_associations')
) )
assign_mapper( context, Dataset, Dataset.table,
@@ -659,7 +688,8 @@
primaryjoin=( Dataset.table.c.id == LibraryDatasetDatasetAssociation.table.c.dataset_id ) ),
active_library_associations=relation(
LibraryDatasetDatasetAssociation,
- primaryjoin=( ( Dataset.table.c.id == LibraryDatasetDatasetAssociation.table.c.dataset_id ) & ( LibraryDatasetDatasetAssociation.table.c.deleted == False ) ) )
+ primaryjoin=( ( Dataset.table.c.id == LibraryDatasetDatasetAssociation.table.c.dataset_id ) & ( LibraryDatasetDatasetAssociation.table.c.deleted == False ) ) ),
+ tags=relation(DatasetTagAssociation, backref='datasets')
) )
assign_mapper( context, HistoryDatasetAssociationDisplayAtAuthorization, HistoryDatasetAssociationDisplayAtAuthorization.table,
@@ -678,7 +708,8 @@
assign_mapper( context, History, History.table,
properties=dict( galaxy_sessions=relation( GalaxySessionToHistoryAssociation ),
datasets=relation( HistoryDatasetAssociation, backref="history", order_by=asc(HistoryDatasetAssociation.table.c.hid) ),
- active_datasets=relation( HistoryDatasetAssociation, primaryjoin=( ( HistoryDatasetAssociation.table.c.history_id == History.table.c.id ) & ( not_( HistoryDatasetAssociation.table.c.deleted ) ) ), order_by=asc( HistoryDatasetAssociation.table.c.hid ), viewonly=True )
+ active_datasets=relation( HistoryDatasetAssociation, primaryjoin=( ( HistoryDatasetAssociation.table.c.history_id == History.table.c.id ) & ( not_( HistoryDatasetAssociation.table.c.deleted ) ) ), order_by=asc( HistoryDatasetAssociation.table.c.hid ), viewonly=True ),
+ tags=relation(HistoryTagAssociation, backref="histories")
) )
assign_mapper( context, HistoryUserShareAssociation, HistoryUserShareAssociation.table,
@@ -938,6 +969,25 @@
lazy=False )
) )
+assign_mapper( context, Tag, Tag.table,
+ properties=dict( children=relation(Tag, backref=backref( 'parent', remote_side=[Tag.table.c.id] ) )
+ ) )
+
+assign_mapper( context, HistoryTagAssociation, HistoryTagAssociation.table,
+ properties=dict( tag=relation(Tag, backref="tagged_histories") ),
+ primary_key=[HistoryTagAssociation.table.c.history_id, HistoryTagAssociation.table.c.tag_id]
+ )
+
+assign_mapper( context, DatasetTagAssociation, DatasetTagAssociation.table,
+ properties=dict( tag=relation(Tag, backref="tagged_datasets") ),
+ primary_key=[DatasetTagAssociation.table.c.dataset_id, DatasetTagAssociation.table.c.tag_id]
+ )
+
+assign_mapper( context, HistoryDatasetAssociationTagAssociation, HistoryDatasetAssociationTagAssociation.table,
+ properties=dict( tag=relation(Tag, backref="tagged_history_dataset_associations") ),
+ primary_key=[HistoryDatasetAssociationTagAssociation.table.c.history_dataset_association_id, HistoryDatasetAssociationTagAssociation.table.c.tag_id]
+ )
+
def db_next_hid( self ):
"""
Override __next_hid to generate from the database in a concurrency
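
For anyone following the tagging work, here is a minimal sketch of how the relations wired up above could be traversed once the mappers are in place. It assumes an initialized Galaxy model and an existing history id (the value 1 is made up):

from galaxy import model

history = model.History.get( 1 )           # hypothetical id of an existing history
for hta in history.tags:                   # HistoryTagAssociation rows via the new 'tags' relation
    print hta.tag.name, hta.user_value     # each association points back to a Tag row
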
details: http://www.bx.psu.edu/hg/galaxy/rev/b6b4d07ad087
changeset: 2644:b6b4d07ad087
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:35:27 2009 -0400
description:
changes to the model to support tags
1 file(s) affected in this change:
lib/galaxy/model/__init__.py
diffs (41 lines):
diff -r 1835c027763a -r b6b4d07ad087 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Fri Aug 28 15:31:06 2009 -0400
+++ b/lib/galaxy/model/__init__.py Fri Aug 28 15:35:27 2009 -0400
@@ -1142,6 +1142,37 @@
self.user = None
self.title = None
self.content = None
+
+class Tag ( object ):
+ def __init__( self, id=None, type=None, parent_id=None, name=None ):
+ self.id = id
+ self.type = type
+ self.parent_id = parent_id
+ self.name = name
+
+ def __str__ ( self ):
+ return "Tag(id=%s, type=%s, parent_id=%s, name=%s)" % ( self.id, self.type, self.parent_id, self.name )
+
+class ItemTagAssociation ( object ):
+ def __init__( self, item_id=None, tag_id=None, user_tname=None, value=None ):
+ self.item_id = item_id
+ self.tag_id = tag_id
+ self.user_tname = user_tname
+ self.value = value
+ self.user_value = None
+
+ def __str__ ( self ):
+ return "%s(item_id=%s, item_tag=%s, user_tname=%s, value=%s, user_value=%s)" % (self.__class__.__name__, self.item_id, self.tag_id, self.user_tname, self.value. self.user_value )
+
+
+class HistoryTagAssociation ( ItemTagAssociation ):
+ pass
+
+class DatasetTagAssociation ( ItemTagAssociation ):
+ pass
+
+class HistoryDatasetAssociationTagAssociation ( ItemTagAssociation ):
+ pass
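
A quick, hypothetical illustration of the classes introduced above; the id, type and name values are made up, only the constructors come from the changeset:

from galaxy import model

tag = model.Tag( id=1, type=0, name="genome" )
assoc = model.HistoryTagAssociation( item_id=42, tag_id=tag.id, user_tname="genome" )
print tag                                  # -> Tag(id=1, type=0, parent_id=None, name=genome)
print assoc.item_id, assoc.user_tname      # -> 42 genome
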

29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/cc4944a62b66
changeset: 2640:cc4944a62b66
user: Greg Von Kuster <greg(a)bx.psu.edu>
date: Fri Aug 28 16:52:58 2009 -0400
description:
Performance improvements in the GalaxyRBACAgent for checking access permissions on libraries. Also added information to main config regarding mysql timeouts ( resolves ticket # 132 ).
27 file(s) affected in this change:
lib/galaxy/model/__init__.py
lib/galaxy/model/mapping.py
lib/galaxy/security/__init__.py
lib/galaxy/tools/actions/__init__.py
lib/galaxy/tools/parameters/basic.py
lib/galaxy/web/controllers/admin.py
lib/galaxy/web/controllers/dataset.py
lib/galaxy/web/controllers/history.py
lib/galaxy/web/controllers/library.py
lib/galaxy/web/controllers/root.py
templates/admin/library/browse_library.mako
templates/dataset/edit_attributes.mako
templates/library/browse_library.mako
templates/library/common.mako
templates/library/folder_info.mako
templates/library/folder_permissions.mako
templates/library/ldda_edit_info.mako
templates/library/ldda_info.mako
templates/library/library_dataset_info.mako
templates/library/library_dataset_permissions.mako
templates/library/library_info.mako
templates/library/library_permissions.mako
templates/mobile/history/detail.mako
templates/mobile/manage_library.mako
templates/root/history_common.mako
test/base/twilltestcase.py
universe_wsgi.ini.sample
diffs (1463 lines):
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/model/__init__.py Fri Aug 28 16:52:58 2009 -0400
@@ -723,15 +723,19 @@
return None
@property
def active_components( self ):
- return list( self.active_folders ) + list( self.active_datasets )
+ return list( self.active_folders ) + list( self.active_library_datasets )
+ @property
+ def active_library_datasets( self ):
+ # This needs to be a list
+ return [ ld for ld in self.datasets if not ld.library_dataset_dataset_association.deleted ]
+ @property
+ def activatable_library_datasets( self ):
+ # This needs to be a list
+ return [ ld for ld in self.datasets if not ld.library_dataset_dataset_association.dataset.deleted ]
@property
def active_datasets( self ):
# This needs to be a list
- return [ ld for ld in self.datasets if not ld.library_dataset_dataset_association.deleted ]
- @property
- def activatable_datasets( self ):
- # This needs to be a list
- return [ ld for ld in self.datasets if not ld.library_dataset_dataset_association.dataset.deleted ]
+ return [ ld.library_dataset_dataset_association.dataset for ld in self.datasets if not ld.library_dataset_dataset_association.deleted ]
@property #make this a relation
def activatable_folders( self ):
# This needs to be a list
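
A minimal sketch contrasting the two renamed properties above, assuming an initialized Galaxy model and an existing folder id (the value 1 is made up):

from galaxy import model

folder = model.LibraryFolder.get( 1 )                  # hypothetical folder id
library_datasets = folder.active_library_datasets      # LibraryDataset wrappers, used by the templates
datasets = folder.active_datasets                      # the underlying Dataset objects, used for permission checks
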
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/model/mapping.py Fri Aug 28 16:52:58 2009 -0400
@@ -740,7 +740,7 @@
assign_mapper( context, DatasetPermissions, DatasetPermissions.table,
properties=dict(
dataset=relation( Dataset, backref="actions" ),
- role=relation( Role, backref="actions" )
+ role=relation( Role, backref="dataset_actions" )
)
)
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/security/__init__.py
--- a/lib/galaxy/security/__init__.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/security/__init__.py Fri Aug 28 16:52:58 2009 -0400
@@ -33,8 +33,10 @@
def get_actions( self ):
"""Get all permitted actions as a list of Action objects"""
return self.permitted_actions.__dict__.values()
- def allow_action( self, user, action, **kwd ):
+ def allow_action( self, user, roles, action, **kwd ):
raise 'No valid method of checking action (%s) on %s for user %s.' % ( action, kwd, user )
+ def get_item_action( self, action, item ):
+ raise 'No valid method of retrieving action (%s) for item %s.' % ( action, item )
def guess_derived_permissions_for_datasets( self, datasets = [] ):
raise "Unimplemented Method"
def associate_components( self, **kwd ):
@@ -79,40 +81,48 @@
( self.model.LibraryFolder, self.model.LibraryFolderPermissions ),
( self.model.LibraryDataset, self.model.LibraryDatasetPermissions ),
( self.model.LibraryDatasetDatasetAssociation, self.model.LibraryDatasetDatasetAssociationPermissions ) )
- def allow_action( self, user, action, **kwd ):
+ @property
+ def sa_session( self ):
+ """
+ Returns a SQLAlchemy session -- currently just gets the current
+ session from the threadlocal session context, but this is provided
+ to allow migration toward a more SQLAlchemy 0.4 style of use.
+ """
+ return self.model.context.current
+ def allow_action( self, user, roles, action, **kwd ):
if 'dataset' in kwd:
- return self.allow_dataset_action( user, action, kwd[ 'dataset' ] )
+ return self.allow_dataset_action( user, roles, action, kwd[ 'dataset' ] )
elif 'library_item' in kwd:
- return self.allow_library_item_action( user, action, kwd[ 'library_item' ] )
+ return self.allow_library_item_action( user, roles, action, kwd[ 'library_item' ] )
raise 'No valid method of checking action (%s) for user %s using kwd %s' % ( action, str( user ), str( kwd ) )
- def allow_dataset_action( self, user, action, dataset ):
+ def allow_dataset_action( self, user, roles, action, dataset ):
"""Returns true when user has permission to perform an action"""
- if not isinstance( dataset, self.model.Dataset ):
- dataset = dataset.dataset
if not user:
if action == self.permitted_actions.DATASET_ACCESS and action.action not in [ dp.action for dp in dataset.actions ]:
- return True # anons only get access, and only if there are no roles required for the access action
- # other actions (or if the dataset has roles defined for the access action) fall through to the false below
+ # anons only get access, and only if there are no roles required for the access action
+ # Other actions (or if the dataset has roles defined for the access action) fall through
+ # to the false below
+ return True
elif action.action not in [ dp.action for dp in dataset.actions ]:
if action.model == 'restrict':
- return True # implicit access to restrict-style actions if the dataset does not have the action
- # grant-style actions fall through to the false below
+ # Implicit access to restrict-style actions if the dataset does not have the action
+ # Grant style actions fall through to the false below
+ return True
else:
- user_role_ids = sorted( [ r.id for r in user.all_roles() ] )
perms = self.get_dataset_permissions( dataset )
if action in perms.keys():
# The filter() returns a list of the dataset's role ids of which the user is not a member,
# so an empty list means the user has all of the required roles.
- if not filter( lambda x: x not in user_role_ids, [ r.id for r in perms[ action ] ] ):
- return True # user has all of the roles required to perform the action
- # Fall through to the false because the user is missing at least one required role
- return False # default is to reject
- def allow_library_item_action( self, user, action, library_item ):
+ if not filter( lambda x: x not in roles, [ r for r in perms[ action ] ] ):
+ # User has all of the roles required to perform the action
+ return True
+ # The user is missing at least one required role
+ return False
+ def allow_library_item_action( self, user, roles, action, library_item ):
if user is None:
# All permissions are granted, so non-users cannot have permissions
return False
if action.model == 'grant':
- user_role_ids = [ r.id for r in user.all_roles() ]
# Check to see if user has access to any of the roles
allowed_role_assocs = []
for item_class, permission_class in self.library_item_assocs:
@@ -126,11 +136,17 @@
elif permission_class == self.model.LibraryDatasetDatasetAssociationPermissions:
allowed_role_assocs = permission_class.filter_by( action=action.action, library_dataset_dataset_association_id=library_item.id ).all()
for allowed_role_assoc in allowed_role_assocs:
- if allowed_role_assoc.role_id in user_role_ids:
+ if allowed_role_assoc.role in roles:
return True
return False
else:
raise 'Unimplemented model (%s) specified for action (%s)' % ( action.model, action.action )
+ def get_item_action( self, action, item ):
+ # item must be one of: Dataset, Library, LibraryFolder, LibraryDataset, LibraryDatasetDatasetAssociation
+ for permission in item.actions:
+ if permission.action == action:
+ return permission
+ return None
def guess_derived_permissions_for_datasets( self, datasets=[] ):
"""Returns a dict of { action : [ role, role, ... ] } for the output dataset based upon provided datasets"""
perms = {}
@@ -260,7 +276,10 @@
if [ assoc for assoc in dataset.history_associations if assoc.history not in user.histories ]:
# Don't change permissions on a dataset associated with a history not owned by the user
continue
- if bypass_manage_permission or self.allow_action( user, self.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=dataset ):
+ if bypass_manage_permission or self.allow_action( user,
+ user.all_roles(),
+ self.permitted_actions.DATASET_MANAGE_PERMISSIONS,
+ dataset=dataset ):
self.set_all_dataset_permissions( dataset, permissions )
def history_get_default_permissions( self, history ):
permissions = {}
@@ -402,17 +421,18 @@
lp = permission_class( action.action, target_library_item, private_role )
lp.flush()
else:
- raise 'Invalid class (%s) specified for target_library_item (%s)' % ( target_library_item.__class__, target_library_item.__class__.__name__ )
- def show_library_item( self, user, library_item ):
- if self.allow_action( user, self.permitted_actions.LIBRARY_MODIFY, library_item=library_item ) or \
- self.allow_action( user, self.permitted_actions.LIBRARY_MANAGE, library_item=library_item ) or \
- self.allow_action( user, self.permitted_actions.LIBRARY_ADD, library_item=library_item ):
+ raise 'Invalid class (%s) specified for target_library_item (%s)' % \
+ ( target_library_item.__class__, target_library_item.__class__.__name__ )
+ def show_library_item( self, user, roles, library_item ):
+ if self.allow_action( user, roles, self.permitted_actions.LIBRARY_MODIFY, library_item=library_item ) or \
+ self.allow_action( user, roles, self.permitted_actions.LIBRARY_MANAGE, library_item=library_item ) or \
+ self.allow_action( user, roles, self.permitted_actions.LIBRARY_ADD, library_item=library_item ):
return True
if isinstance( library_item, self.model.Library ):
- return self.show_library_item( user, library_item.root_folder )
+ return self.show_library_item( user, roles, library_item.root_folder )
elif isinstance( library_item, self.model.LibraryFolder ):
for folder in library_item.folders:
- if self.show_library_item( user, folder ):
+ if self.show_library_item( user, roles, folder ):
return True
return False
def set_entity_user_associations( self, users=[], roles=[], groups=[], delete_existing_assocs=True ):
@@ -462,26 +482,30 @@
if 'role' in kwd:
return self.model.GroupRoleAssociation.filter_by( role_id = kwd['role'].id, group_id = kwd['group'].id ).first()
raise 'No valid method of associating provided components: %s' % kwd
- def check_folder_contents( self, user, entry ):
+ def check_folder_contents( self, user, roles, folder ):
"""
- Return true if there are any datasets under 'folder' that the
- user has access permission on. We do this a lot and it's a
- pretty inefficient method, optimizations are welcomed.
+ Return true if there are any datasets under 'folder' that are public or that the
+ user has access permission on.
"""
- if isinstance( entry, self.model.Library ):
- return self.check_folder_contents( user, entry.root_folder )
- elif isinstance( entry, self.model.LibraryFolder ):
- for library_dataset in entry.active_datasets:
- if self.allow_action( user, self.permitted_actions.DATASET_ACCESS, dataset=library_dataset.library_dataset_dataset_association.dataset ):
- return True
- for folder in entry.active_folders:
- if self.check_folder_contents( user, folder ):
- return True
- return False
- elif isinstance( entry, self.model.LibraryDatasetDatasetAssociation ):
- return self.allow_action( user, self.permitted_actions.DATASET_ACCESS, dataset=entry.dataset )
- else:
- raise 'Passed an illegal object to check_folder_contents: %s' % type( entry )
+ action = self.permitted_actions.DATASET_ACCESS.action
+ lddas = self.sa_session.query( self.model.LibraryDatasetDatasetAssociation ) \
+ .join( "library_dataset" ) \
+ .filter( self.model.LibraryDataset.folder == folder ) \
+ .join( "dataset" ) \
+ .options( eagerload_all( "dataset.actions" ) ) \
+ .all()
+ for ldda in lddas:
+ ldda_access = self.get_item_action( action, ldda.dataset )
+ if ldda_access is None:
+ # Dataset is public
+ return True
+ if ldda_access.role in roles:
+ # The current user has access permission on the dataset
+ return True
+ for sub_folder in folder.active_folders:
+ if self.check_folder_contents( user, roles, sub_folder ):
+ return True
+ return False
class HostAgent( RBACAgent ):
"""
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/tools/actions/__init__.py
--- a/lib/galaxy/tools/actions/__init__.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/tools/actions/__init__.py Fri Aug 28 16:52:58 2009 -0400
@@ -47,8 +47,15 @@
assoc.dataset = new_data
assoc.flush()
data = new_data
- # TODO, Nate: Make sure the permitted actions here are appropriate.
- if data and not trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset=data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ if data and not trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset=data.dataset ):
raise "User does not have permission to use a dataset (%s) provided for input." % data.id
return data
if isinstance( input, DataToolParameter ):
@@ -261,10 +268,17 @@
# parameters to the command as a special case.
for name, value in tool.params_to_strings( incoming, trans.app ).iteritems():
job.add_parameter( name, value )
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
for name, dataset in inp_data.iteritems():
if dataset:
- # TODO, Nate: Make sure the permitted actions here are appropriate.
- if not trans.app.security_agent.allow_action( trans.user, dataset.permitted_actions.DATASET_ACCESS, dataset=dataset ):
+ if not trans.app.security_agent.allow_action( user,
+ roles,
+ dataset.permitted_actions.DATASET_ACCESS,
+ dataset=dataset.dataset ):
raise "User does not have permission to use a dataset (%s) provided for input." % data.id
job.add_input_dataset( name, dataset )
else:
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Fri Aug 28 16:52:58 2009 -0400
@@ -1137,6 +1137,11 @@
field = form_builder.SelectField( self.name, self.multiple, None, self.refresh_on_change, refresh_on_change_values = self.refresh_on_change_values )
# CRUCIAL: the dataset_collector function needs to be local to DataToolParameter.get_html_field()
def dataset_collector( hdas, parent_hid ):
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
for i, hda in enumerate( hdas ):
if len( hda.name ) > 30:
hda_name = '%s..%s' % ( hda.name[:17], hda.name[-11:] )
@@ -1148,12 +1153,18 @@
hid = str( hda.hid )
if not hda.dataset.state in [galaxy.model.Dataset.states.ERROR, galaxy.model.Dataset.states.DISCARDED] and \
hda.visible and \
- trans.app.security_agent.allow_action( trans.user, hda.permitted_actions.DATASET_ACCESS, dataset=hda ):
+ trans.app.security_agent.allow_action( user,
+ roles,
+ hda.permitted_actions.DATASET_ACCESS,
+ dataset=hda.dataset ):
# If we are sending data to an external application, then we need to make sure there are no roles
# associated with the dataset that restrict it's access from "public". We determine this by sending
# None as the user to the allow_action method.
if self.tool and self.tool.tool_type == 'data_destination':
- if not trans.app.security_agent.allow_action( None, hda.permitted_actions.DATASET_ACCESS, dataset=hda ):
+ if not trans.app.security_agent.allow_action( None,
+ None,
+ hda.permitted_actions.DATASET_ACCESS,
+ dataset=hda.dataset ):
continue
if self.options and hda.get_dbkey() != filter_value:
continue
@@ -1165,7 +1176,10 @@
if target_ext:
if converted_dataset:
hda = converted_dataset
- if not trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.DATASET_ACCESS, dataset=hda.dataset ):
+ if not trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.DATASET_ACCESS,
+ dataset=hda.dataset ):
continue
selected = ( value and ( hda in value ) )
field.add_option( "%s: (as %s) %s" % ( hid, target_ext, hda_name ), hda.id, selected )
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/web/controllers/admin.py
--- a/lib/galaxy/web/controllers/admin.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/web/controllers/admin.py Fri Aug 28 16:52:58 2009 -0400
@@ -269,7 +269,7 @@
gra.delete()
gra.flush()
# Delete DatasetPermissionss
- for dp in role.actions:
+ for dp in role.dataset_actions:
dp.delete()
dp.flush()
msg = "The following have been purged from the database for role '%s': " % role.name
@@ -997,16 +997,8 @@
msg=msg,
messagetype=messagetype )
elif action == 'delete':
- def delete_folder( folder ):
- folder.refresh()
- for subfolder in folder.active_folders:
- delete_folder( subfolder )
- for ldda in folder.active_datasets:
- ldda.deleted = True
- ldda.flush()
- folder.deleted = True
- folder.flush()
- delete_folder( folder )
+ folder.deleted = True
+ folder.flush()
msg = "Folder '%s' and all of its contents have been marked deleted" % folder.name
return trans.response.send_redirect( web.url_for( action='browse_library',
id=library_id,
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/web/controllers/dataset.py
--- a/lib/galaxy/web/controllers/dataset.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/web/controllers/dataset.py Fri Aug 28 16:52:58 2009 -0400
@@ -108,7 +108,15 @@
data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
- if trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset=data.dataset ):
if data.state == trans.model.Dataset.states.UPLOAD:
return trans.show_error_message( "Please wait until this dataset finishes uploading before attempting to view it." )
if filename is None or filename.lower() == "index":
@@ -142,9 +150,17 @@
if 'display_url' not in kwd or 'redirect_url' not in kwd:
return trans.show_error_message( 'Invalid parameters specified for "display at" link, please contact a Galaxy administrator' )
redirect_url = kwd['redirect_url'] % urllib.quote_plus( kwd['display_url'] )
- if trans.app.security_agent.allow_action( None, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( None, None, data.permitted_actions.DATASET_ACCESS, dataset=data.dataset ):
return trans.response.send_redirect( redirect_url ) # anon access already permitted by rbac
- if trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset=data.dataset ):
trans.app.host_security_agent.set_dataset_permissions( data, trans.user, site )
return trans.response.send_redirect( redirect_url )
else:
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/web/controllers/history.py Fri Aug 28 16:52:58 2009 -0400
@@ -411,6 +411,7 @@
err_msg=err_msg,
share_button=True ) )
user = trans.get_user()
+ user_roles = user.all_roles()
histories, send_to_users, send_to_err = self._get_histories_and_users( trans, user, id, email )
send_to_err = ''
# The user has made a choice, so dictionaries will be built for sharing
@@ -442,13 +443,15 @@
for hda in history.activatable_datasets:
# If the current dataset is not public, we may need to perform an action on it to
# make it accessible by the other user.
- if not trans.app.security_agent.allow_action( send_to_user,
+ if not trans.app.security_agent.allow_action( send_to_user,
+ send_to_user.all_roles(),
trans.app.security_agent.permitted_actions.DATASET_ACCESS,
- dataset=hda ):
+ dataset=hda.dataset ):
# The user with which we are sharing the history does not have access permission on the current dataset
- if trans.app.security_agent.allow_action( user,
+ if trans.app.security_agent.allow_action( user,
+ user_roles,
trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS,
- dataset=hda ) and not hda.dataset.library_associations:
+ dataset=hda.dataset ) and not hda.dataset.library_associations:
# The current user has authority to change permissions on the current dataset because
# they have permission to manage permissions on the dataset and the dataset is not associated
# with a library.
@@ -525,6 +528,7 @@
cannot_change = {}
no_change_needed = {}
unique_no_change_needed = {}
+ user_roles = user.all_roles()
for history in histories:
for send_to_user in send_to_users:
# Make sure the current history has not already been shared with the current send_to_user
@@ -552,13 +556,15 @@
no_change_needed[ send_to_user ][ history ] = [ hda ]
else:
no_change_needed[ send_to_user ][ history ].append( hda )
- elif not trans.app.security_agent.allow_action( send_to_user,
+ elif not trans.app.security_agent.allow_action( send_to_user,
+ send_to_user.all_roles(),
trans.app.security_agent.permitted_actions.DATASET_ACCESS,
- dataset=hda ):
+ dataset=hda.dataset ):
# The user with which we are sharing the history does not have access permission on the current dataset
- if trans.app.security_agent.allow_action( user,
+ if trans.app.security_agent.allow_action( user,
+ user_roles,
trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS,
- dataset=hda ) and not hda.dataset.library_associations:
+ dataset=hda.dataset ) and not hda.dataset.library_associations:
# The current user has authority to change permissions on the current dataset because
# they have permission to manage permissions on the dataset and the dataset is not associated
# with a library.
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/web/controllers/library.py
--- a/lib/galaxy/web/controllers/library.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/web/controllers/library.py Fri Aug 28 16:52:58 2009 -0400
@@ -62,14 +62,29 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted==False ).order_by( trans.app.model.Library.name ).all()
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted==False ) \
+ .order_by( trans.app.model.Library.name ).all()
authorized_libraries = []
for library in all_libraries:
- if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ) or \
- trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ) or \
- trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ) or \
- trans.app.security_agent.check_folder_contents( trans.user, library ) or \
- trans.app.security_agent.show_library_item( trans.user, library ):
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.LIBRARY_ADD,
+ library_item=library ) or \
+ trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
+ library_item=library ) or \
+ trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
+ library_item=library ) or \
+ trans.app.security_agent.check_folder_contents( user, roles, library.root_folder ) or \
+ trans.app.security_agent.show_library_item( user, roles, library ):
authorized_libraries.append( library )
return trans.fill_template( '/library/browse_libraries.mako',
libraries=authorized_libraries,
@@ -264,9 +279,15 @@
msg=util.sanitize_text( msg ),
messagetype='error' ) )
seen = []
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
for id in ldda_ids:
ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( id )
- if not ldda or not trans.app.security_agent.allow_action( trans.user,
+ if not ldda or not trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.DATASET_ACCESS,
dataset = ldda.dataset ):
continue
@@ -363,9 +384,15 @@
id=library_id,
msg=util.sanitize_text( msg ),
messagetype='error' ) )
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
if action == 'information':
if params.get( 'edit_attributes_button', False ):
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=library_dataset ):
if params.get( 'edit_attributes_button', False ):
@@ -391,7 +418,8 @@
messagetype=messagetype )
elif action == 'permissions':
if params.get( 'update_roles_button', False ):
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=library_dataset ):
# The user clicked the Save button on the 'Associate With Roles' form
@@ -436,6 +464,11 @@
last_used_build = replace_dataset.library_dataset_dataset_association.dbkey
else:
replace_dataset = None
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
# Let's not overwrite the imported datatypes module with the variable datatypes?
# The built-in 'id' is overwritten in lots of places as well
ldatatypes = [ dtype_name for dtype_name, dtype_value in trans.app.datatypes_registry.datatypes_by_extension.iteritems() if dtype_value.allow_datatype_change ]
@@ -479,10 +512,12 @@
if action == 'permissions':
if params.get( 'update_roles_button', False ):
# The user clicked the Save button on the 'Associate With Roles' form
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=ldda ) and \
- trans.app.security_agent.allow_action( trans.user,
+ trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS,
dataset=ldda.dataset ):
permissions = {}
@@ -523,7 +558,8 @@
elif action == 'edit_info':
if params.get( 'change', False ):
# The user clicked the Save button on the 'Change data type' form
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=ldda ):
if ldda.datatype.allow_datatype_change and trans.app.datatypes_registry.get_datatype_by_extension( params.datatype ).allow_datatype_change:
@@ -546,7 +582,8 @@
messagetype=messagetype )
elif params.get( 'save', False ):
# The user clicked the Save button on the 'Edit Attributes' form
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=ldda ):
old_name = ldda.name
@@ -587,7 +624,8 @@
messagetype=messagetype )
elif params.get( 'detect', False ):
# The user clicked the Auto-detect button on the 'Edit Attributes' form
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=ldda ):
for name, spec in ldda.datatype.metadata_spec.items():
@@ -611,7 +649,8 @@
msg=msg,
messagetype=messagetype )
elif params.get( 'delete', False ):
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=folder ):
ldda.deleted = True
@@ -628,7 +667,8 @@
widgets=widgets,
msg=msg,
messagetype=messagetype )
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=ldda ):
ldda.datatype.before_edit( ldda )
@@ -668,10 +708,12 @@
messagetype='error' ) )
if action == 'permissions':
if params.get( 'update_roles_button', False ):
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=ldda ) and \
- trans.app.security_agent.allow_action( trans.user,
+ trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS,
dataset=ldda.dataset ):
permissions = {}
@@ -704,10 +746,12 @@
library_id=library_id,
msg=msg,
messagetype=messagetype )
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=ldda ) and \
- trans.app.security_agent.allow_action( trans.user,
+ trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS,
dataset=ldda.dataset ):
# Ensure that the permissions across all library items are identical, otherwise we can't update them together.
@@ -741,10 +785,12 @@
library_id=library_id,
msg=msg,
messagetype=messagetype )
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_ADD,
library_item=folder ) or \
- ( replace_dataset and trans.app.security_agent.allow_action( trans.user,
+ ( replace_dataset and trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=replace_dataset ) ):
if params.get( 'new_dataset_button', False ):
@@ -769,7 +815,8 @@
# Since permissions on all LibraryDatasetDatasetAssociations must be the same at this point, we only need
# to check one of them to see if the current user can manage permissions on them.
check_ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id_list[0] )
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=check_ldda ):
if replace_dataset:
@@ -892,7 +939,13 @@
# Since permissions on all LibraryDatasetDatasetAssociations must be the same at this point, we only need
# to check one of them to see if the current user can manage permissions on them.
check_ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id_list[0] )
- if trans.app.security_agent.allow_action( trans.user,
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=check_ldda ):
if replace_dataset:
@@ -957,6 +1010,11 @@
id=library_id,
msg=util.sanitize_text( msg ),
messagetype='error' ) )
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
if action == 'new':
if params.new == 'submitted':
new_folder = trans.app.model.LibraryFolder( name=util.restore_text( params.name ),
@@ -994,7 +1052,8 @@
else:
widgets = []
if params.get( 'rename_folder_button', False ):
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
library_item=folder ):
old_name = folder.name
@@ -1037,7 +1096,8 @@
elif action == 'permissions':
if params.get( 'update_roles_button', False ):
# The user clicked the Save button on the 'Associate With Roles' form
- if trans.app.security_agent.allow_action( trans.user,
+ if trans.app.security_agent.allow_action( user,
+ roles,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE,
library_item=folder ):
permissions = {}
@@ -1162,12 +1222,24 @@
msg=util.sanitize_text( msg ),
messagetype='done' ) )
-
-def get_authorized_libs(trans, user):
- all_libraries = trans.app.model.Library.filter(trans.app.model.Library.table.c.deleted == False).order_by(trans.app.model.Library.name).all()
+def get_authorized_libs( trans, user ):
+ # TODO: this is a mis-named function - the name should reflect the authorization policy
+ # If user is not authenticated, this method should not even be called. Also, it looks
+ # like all that is using this is the new request stuff, so it should be placed there.
+ if not user:
+ return []
+ roles = user.all_roles()
+ all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted == False ) \
+ .order_by( trans.app.model.Library.name ).all()
authorized_libraries = []
for library in all_libraries:
- if trans.app.security_agent.allow_action(user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library) \
- or trans.app.security_agent.allow_action(user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library):
- authorized_libraries.append(library)
- return authorized_libraries
\ No newline at end of file
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.LIBRARY_ADD,
+ library_item=library ) \
+ or trans.app.security_agent.allow_action( user,
+ roles,
+ trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
+ library_item=library ):
+ authorized_libraries.append( library )
+ return authorized_libraries
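
A hedged sketch of how the reworked get_authorized_libs() helper might be called from the requests code it serves; the trans object comes from a controller method and is assumed here:

from galaxy.web.controllers.library import get_authorized_libs

libraries = get_authorized_libs( trans, trans.user )   # returns [] for anonymous users
for library in libraries:
    print library.name
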
diff -r 36f438ce1f82 -r cc4944a62b66 lib/galaxy/web/controllers/root.py
--- a/lib/galaxy/web/controllers/root.py Fri Aug 28 15:59:16 2009 -0400
+++ b/lib/galaxy/web/controllers/root.py Fri Aug 28 16:52:58 2009 -0400
@@ -152,7 +152,15 @@
except:
return "Dataset id '%s' is invalid" %str( id )
if data:
- if trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset = data.dataset ):
mime = trans.app.datatypes_registry.get_mimetype_by_extension( data.extension.lower() )
trans.response.set_content_type(mime)
if tofile:
@@ -184,7 +192,15 @@
if data:
child = data.get_child_by_designation( designation )
if child:
- if trans.app.security_agent.allow_action( trans.user, child.permitted_actions.DATASET_ACCESS, dataset = child ):
+ user = trans.user
+ if user:
+ roles = user.all_roles
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ child.permitted_actions.DATASET_ACCESS,
+ dataset = child ):
return self.display( trans, id=child.id, tofile=tofile, toext=toext )
else:
return "You are not privileged to access this dataset."
@@ -200,11 +216,21 @@
if 'authz_method' in kwd:
authz_method = kwd['authz_method']
if data:
- if authz_method == 'rbac' and trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles
+ else:
+ roles = None
+ if authz_method == 'rbac' and trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset = data ):
trans.response.set_content_type( data.get_mime() )
trans.log_event( "Formatted dataset id %s for display at %s" % ( str( id ), display_app ) )
return data.as_display_type( display_app, **kwd )
- elif authz_method == 'display_at' and trans.app.host_security_agent.allow_action( trans.request.remote_addr, data.permitted_actions.DATASET_ACCESS, dataset = data ):
+ elif authz_method == 'display_at' and trans.app.host_security_agent.allow_action( trans.request.remote_addr,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset = data ):
trans.response.set_content_type( data.get_mime() )
return data.as_display_type( display_app, **kwd )
else:
@@ -247,7 +273,15 @@
return trans.show_error_message( "Problem retrieving dataset." )
if id is not None and data.history.user is not None and data.history.user != trans.user:
return trans.show_error_message( "This instance of a dataset (%s) in a history does not belong to you." % ( data.id ) )
- if trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset=data ):
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ data.permitted_actions.DATASET_ACCESS,
+ dataset=data.dataset ):
if data.state == trans.model.Dataset.states.UPLOAD:
return trans.show_error_message( "Please wait until this dataset finishes uploading before attempting to edit its metadata." )
params = util.Params( kwd, safe=False )
@@ -313,7 +347,10 @@
elif params.update_roles_button:
if not trans.user:
return trans.show_error_message( "You must be logged in if you want to change permissions." )
- if trans.app.security_agent.allow_action( trans.user, data.dataset.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset = data.dataset ):
+ if trans.app.security_agent.allow_action( user,
+ roles,
+ data.dataset.permitted_actions.DATASET_MANAGE_PERMISSIONS,
+ dataset = data.dataset ):
permissions = {}
for k, v in trans.app.model.Dataset.permitted_actions.items():
in_roles = params.get( k + '_in', [] )
diff -r 36f438ce1f82 -r cc4944a62b66 templates/admin/library/browse_library.mako
--- a/templates/admin/library/browse_library.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/admin/library/browse_library.mako Fri Aug 28 16:52:58 2009 -0400
@@ -138,12 +138,12 @@
%if show_deleted:
<%
parent_folders = folder.activatable_folders
- parent_datasets = folder.activatable_datasets
+ parent_datasets = folder.activatable_library_datasets
%>
%else:
<%
parent_folders = folder.active_folders
- parent_datasets = folder.active_datasets
+ parent_datasets = folder.active_library_datasets
%>
%endif
%for folder in name_sorted( parent_folders ):
diff -r 36f438ce1f82 -r cc4944a62b66 templates/dataset/edit_attributes.mako
--- a/templates/dataset/edit_attributes.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/dataset/edit_attributes.mako Fri Aug 28 16:52:58 2009 -0400
@@ -3,6 +3,13 @@
<%def name="title()">${_('Edit Dataset Attributes')}</%def>
+<%
+ user = trans.user
+ if user:
+ user_roles = user.all_roles()
+ else:
+ user_roles = None
+%>
<%def name="datatype( dataset, datatypes )">
<select name="datatype">
@@ -134,9 +141,9 @@
</div>
<p />
-%if trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset = data ):
+%if trans.app.security_agent.allow_action( user, user_roles, data.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=data.dataset ):
<%namespace file="/dataset/security_common.mako" import="render_permission_form" />
- ${render_permission_form( data.dataset, data.name, h.url_for( controller='root', action='edit', id=data.id ), trans.user.all_roles() )}
+ ${render_permission_form( data.dataset, data.name, h.url_for( controller='root', action='edit', id=data.id ), user_roles )}
%elif trans.user:
<div class="toolForm">
<div class="toolFormTitle">View Permissions</div>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/browse_library.mako
--- a/templates/library/browse_library.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/browse_library.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,15 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
-<% from galaxy import util %>
+<%
+ from galaxy import util
+ from time import strftime
+
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
<%def name="title()">Browse data library</%def>
<%def name="stylesheets()">
@@ -110,14 +119,14 @@
<a href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, info=True )}"><b>${ldda.name[:60]}</b></a>
<a id="dataset-${ldda.id}-popup" class="popup-arrow" style="display: none;">▼</a>
<div popupmenu="dataset-${ldda.id}-popup">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, edit_info=True )}">Edit this dataset's information</a>
%else:
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, information=True )}">View this dataset's information</a>
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=ldda.dataset ) and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=ldda.library_dataset ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=ldda.dataset ) and trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, permissions=True )}">Edit this dataset's permissions</a>
- %if current_version and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
+ %if current_version and trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, replace_id=library_dataset.id )}">Upload a new version of this dataset</a>
%endif
%endif
@@ -126,24 +135,26 @@
<a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', id=ldda.id, library_id=library.id )}">Download this dataset</a>
%endif
</div>
-
</td>
<td>${ldda.message}</td>
<td>${uploaded_by}</td>
<td>${ldda.create_time.strftime( "%Y-%m-%d" )}</td>
- </tr>
-
+ </tr>
<%
my_row = row_counter.count
row_counter.increment()
%>
</%def>
-
<%def name="render_folder( folder, folder_pad, created_ldda_ids, library_id, parent=None, row_counter=None )">
<%
def show_folder():
- if trans.app.security_agent.check_folder_contents( trans.user, folder ) or trans.app.security_agent.show_library_item( trans.user, folder ):
+ ## TODO: instead of calling check_folder_contents(), which we've already done prior to getting here,
+ ## add a new method that will itself call check_folder_contents() and build a list of accessible folders
+ ## for each library - this should improve performance for large libraries where the current user can only
+ ## access a small number of folders.
+ if trans.app.security_agent.check_folder_contents( user, roles, folder ) or \
+ trans.app.security_agent.show_library_item( user, roles, folder ):
return True
return False
if not show_folder:
@@ -179,21 +190,21 @@
%endif
<a id="folder_img-${folder.id}-popup" class="popup-arrow" style="display: none;">▼</a>
<div popupmenu="folder_img-${folder.id}-popup">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=folder ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=folder ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library_id, folder_id=folder.id )}">Add datasets to this folder</a>
<a class="action-button" href="${h.url_for( controller='library', action='folder', new=True, id=folder.id, library_id=library_id )}">Create a new sub-folder in this folder</a>
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=folder ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=folder ):
<a class="action-button" href="${h.url_for( controller='library', action='folder', information=True, id=folder.id, library_id=library_id )}">Edit this folder's information</a>
%else:
<a class="action-button" href="${h.url_for( controller='library', action='folder', information=True, id=folder.id, library_id=library_id )}">View this folder's information</a>
%endif
%if forms and not folder.info_association:
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
<a class="action-button" href="${h.url_for( controller='library', action='info_template', library_id=library.id, add=True )}">Add an information template to this folder</a>
%endif
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=folder ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=folder ):
<a class="action-button" href="${h.url_for( controller='library', action='folder', permissions=True, id=folder.id, library_id=library_id )}">Edit this folder's permissions</a>
%endif
</div>
@@ -208,11 +219,11 @@
%for child_folder in name_sorted( folder.active_folders ):
${render_folder( child_folder, pad, created_ldda_ids, library_id, my_row, row_counter )}
%endfor
- %for library_dataset in name_sorted( folder.active_datasets ):
+ %for library_dataset in name_sorted( folder.active_library_datasets ):
<%
selected = created_ldda_ids and library_dataset.library_dataset_dataset_association.id in created_ldda_ids
%>
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.DATASET_ACCESS, dataset=library_dataset.library_dataset_dataset_association.dataset ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.DATASET_ACCESS, dataset=library_dataset.library_dataset_dataset_association.dataset ):
${render_dataset( library_dataset, selected, library, pad, my_row, row_counter )}
%endif
%endfor
@@ -221,7 +232,7 @@
<h2>Data Library “${library.name}”</h2>
<ul class="manage-table-actions">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
%if not deleted:
<li>
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library.root_folder.id )}"><span>Add datasets to this library</span></a>
@@ -231,17 +242,17 @@
</li>
%endif
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
<li><a class="action-button" href="${h.url_for( controller='library', action='library', information=True, id=library.id )}">Edit this library's information</a></li>
%else:
<li><a class="action-button" href="${h.url_for( controller='library', action='library', information=True, id=library.id )}">View this library's information</a></li>
%endif
%if forms and not library.info_association:
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=library ):
<a class="action-button" href="${h.url_for( controller='library', action='info_template', library_id=library.id, add=True )}">Add an information template to this library</a>
%endif
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
<li><a class="action-button" href="${h.url_for( controller='library', action='library', permissions=True, id=library.id )}">Edit this library's permissions</a></li>
%endif
</ul>
@@ -265,7 +276,7 @@
</thead>
</tr>
<% row_counter = RowCounter() %>
- ${render_folder( library.root_folder, 0, created_ldda_ids, library.id, Nonw, row_counter )}
+ ${render_folder( library.root_folder, 0, created_ldda_ids, library.id, None, row_counter )}
<tfoot>
<tr>
<td colspan="4" style="padding-left: 42px;">
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/common.mako
--- a/templates/library/common.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/common.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,65 +1,3 @@
-<% from time import strftime %>
-
-<%def name="render_dataset( library_dataset, selected, library )">
- <%
- ## The received data must always be a LibraryDataset object, but the object id passed to methods from the drop down menu
- ## should be the underlying ldda id to prevent id collision ( which could happen when displaying children, which are always
- ## lddas ). We also need to make sure we're displaying the latest version of this library_dataset, so we display the attributes
- ## from the ldda.
- ldda = library_dataset.library_dataset_dataset_association
- if ldda.user:
- uploaded_by = ldda.user.email
- else:
- uploaded_by = 'anonymous'
- if ldda == ldda.library_dataset.library_dataset_dataset_association:
- current_version = True
- else:
- current_version = False
- %>
- <div class="historyItemWrapper historyItem historyItem-${ldda.state}" id="libraryItem-${ldda.id}">
- ## Header row for library items (name, state, action buttons)
- <div class="historyItemTitleBar">
- <table cellspacing="0" cellpadding="0" border="0" width="100%">
- <tr>
- <td width="*">
- %if selected:
- <input type="checkbox" name="ldda_ids" value="${ldda.id}" checked/>
- %else:
- <input type="checkbox" name="ldda_ids" value="${ldda.id}"/>
- %endif
- <a href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, info=True )}"><b>${ldda.name[:60]}</b></a>
- <a id="dataset-${ldda.id}-popup" class="popup-arrow" style="display: none;">▼</a>
- <div popupmenu="dataset-${ldda.id}-popup">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
- <a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, edit_info=True )}">Edit this dataset's information</a>
- %else:
- <a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, information=True )}">View this dataset's information</a>
- %endif
- ## We're disabling the ability to add templates at the LDDA and LibraryDataset level, but will leave this here for possible future use
- ##%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_ADD, library_item=ldda.library_dataset ):
- ## <a class="action-button" href="${h.url_for( controller='library', action='info_template', library_id=library.id, library_dataset_id=library_dataset.id, new_template=True )}">Add an information template to this dataset</a>
- ##%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=ldda.dataset ) and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=ldda.library_dataset ):
- <a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, id=ldda.id, permissions=True )}">Edit this dataset's permissions</a>
- %if current_version and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
- <a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library.id, folder_id=library_dataset.folder.id, replace_id=library_dataset.id )}">Upload a new version of this dataset</a>
- %endif
- %endif
- %if ldda.has_data:
- <a class="action-button" href="${h.url_for( controller='library', action='datasets', library_id=library.id, ldda_ids=str( ldda.id ), do_action='add' )}">Import this dataset into your current history</a>
- <a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', id=ldda.id, library_id=library.id )}">Download this dataset</a>
- %endif
- </div>
- </td>
- <td width="500">${ldda.message}</td>
- <td width="150">${uploaded_by}</td>
- <td width="60">${ldda.create_time.strftime( "%Y-%m-%d" )}</td>
- </tr>
- </table>
- </div>
- </div>
-</%def>
-
<%def name="render_template_info( library_item, library_id, widgets, editable=True )">
<%
library_item_type = 'unknown type'
@@ -76,13 +14,18 @@
elif isinstance( library_item, trans.app.model.LibraryDatasetDatasetAssociation ):
library_item_type = 'library_dataset_dataset_association'
library_item_desc = 'library dataset'
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
%>
%if widgets:
<p/>
<div class="toolForm">
<div class="toolFormTitle">Other information about ${library_item_desc} ${library_item.name}</div>
<div class="toolFormBody">
- %if editable and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library_item ):
+ %if editable and trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library_item ):
<form name="edit_info" action="${h.url_for( controller='library', action='edit_template_info', library_id=library_id, num_widgets=len( widgets ) )}" method="post">
<input type="hidden" name="library_item_id" value="${library_item.id}"/>
<input type="hidden" name="library_item_type" value="${library_item_type}"/>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/folder_info.mako
--- a/templates/library/folder_info.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/folder_info.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/library/common.mako" import="render_template_info" />
+
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
<br/><br/>
<ul class="manage-table-actions">
@@ -16,7 +24,7 @@
<div class="toolForm">
<div class="toolFormTitle">Edit folder name and description</div>
<div class="toolFormBody">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=folder ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=folder ):
<form name="folder" action="${h.url_for( controller='library', action='folder', rename=True, id=folder.id, library_id=library_id )}" method="post" >
<div class="form-row">
<label>Name:</label>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/folder_permissions.mako
--- a/templates/library/folder_permissions.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/folder_permissions.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/dataset/security_common.mako" import="render_permission_form" />
+
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
<br/><br/>
<ul class="manage-table-actions">
@@ -13,6 +21,6 @@
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=folder ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=folder ):
${render_permission_form( folder, folder.name, h.url_for( controller='library', action='folder', id=folder.id, library_id=library_id, permissions=True ), trans.user.all_roles() )}
%endif
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/ldda_edit_info.mako
--- a/templates/library/ldda_edit_info.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/ldda_edit_info.mako Fri Aug 28 16:52:58 2009 -0400
@@ -2,6 +2,14 @@
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/library/common.mako" import="render_template_info" />
<% from galaxy import util %>
+
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
%if ldda == ldda.library_dataset.library_dataset_dataset_association:
<b><i>This is the latest version of this library dataset</i></b>
@@ -32,7 +40,7 @@
</select>
</%def>
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
<div class="toolForm">
<div class="toolFormTitle">Edit attributes of ${ldda.name}</div>
<div class="toolFormBody">
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/ldda_info.mako
--- a/templates/library/ldda_info.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/ldda_info.mako Fri Aug 28 16:52:58 2009 -0400
@@ -8,6 +8,11 @@
current_version = True
else:
current_version = False
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
%>
%if current_version:
@@ -39,15 +44,15 @@
Information about ${ldda.name}
<a id="dataset-${ldda.id}-popup" class="popup-arrow" style="display: none;">▼</a>
<div popupmenu="dataset-${ldda.id}-popup">
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library_id, folder_id=ldda.library_dataset.folder.id, id=ldda.id, edit_info=True )}">Edit this dataset's information</a>
%else:
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library_id, folder_id=ldda.library_dataset.folder.id, id=ldda.id, information=True )}">View this dataset's information</a>
%endif
- %if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=ldda.dataset ) and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=ldda.library_dataset ):
+ %if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.DATASET_MANAGE_PERMISSIONS, dataset=ldda.dataset ) and trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library_id, folder_id=ldda.library_dataset.folder.id, id=ldda.id, permissions=True )}">Edit this dataset's permissions</a>
%endif
- %if current_version and trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
+ %if current_version and trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=ldda.library_dataset ):
<a class="action-button" href="${h.url_for( controller='library', action='library_dataset_dataset_association', library_id=library_id, folder_id=ldda.library_dataset.folder.id, replace_id=ldda.library_dataset.id )}">Upload a new version of this dataset</a>
%endif
%if ldda.has_data:
@@ -92,6 +97,8 @@
<div><pre id="peek${ldda.id}" class="peek">${ldda.display_peek()}</pre></div>
%endif
## Recurse for child datasets
+ ## TODO: eliminate this - child datasets are deprecated, and where does
+ ## render_dataset() come from anyway - it's not imported!
%if len( ldda.visible_children ) > 0:
<div>
There are ${len( ldda.visible_children )} secondary datasets.
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/library_dataset_info.mako
--- a/templates/library/library_dataset_info.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/library_dataset_info.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/library/common.mako" import="render_template_info" />
+
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
%if library_dataset == library_dataset.library_dataset_dataset_association.library_dataset:
<b><i>This is the latest version of this library dataset</i></b>
@@ -19,7 +27,7 @@
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library_dataset ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library_dataset ):
<div class="toolForm">
<div class="toolFormTitle">Edit attributes of ${library_dataset.name}</div>
<div class="toolFormBody">
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/library_dataset_permissions.mako
--- a/templates/library/library_dataset_permissions.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/library_dataset_permissions.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/dataset/security_common.mako" import="render_permission_form" />>
+
+<%
+ user = trans.user
+ if user:
+ user_roles = user.all_roles()
+ else:
+ user_roles = None
+%>
%if library_dataset == library_dataset.library_dataset_dataset_association.library_dataset:
<b><i>This is the latest version of this library dataset</i></b>
@@ -19,7 +27,7 @@
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_manage, library_item=library_dataset ):
+%if trans.app.security_agent.allow_action( user, user_roles, trans.app.security_agent.permitted_actions.LIBRARY_manage, library_item=library_dataset ):
<%
roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.table.c.name ).all()
%>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/library_info.mako
--- a/templates/library/library_info.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/library_info.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/library/common.mako" import="render_template_info" />
+
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
<br/><br/>
<ul class="manage-table-actions">
@@ -13,7 +21,7 @@
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
<div class="toolForm">
<div class="toolFormTitle">Change library name and description</div>
<div class="toolFormBody">
diff -r 36f438ce1f82 -r cc4944a62b66 templates/library/library_permissions.mako
--- a/templates/library/library_permissions.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/library/library_permissions.mako Fri Aug 28 16:52:58 2009 -0400
@@ -1,6 +1,14 @@
<%inherit file="/base.mako"/>
<%namespace file="/message.mako" import="render_msg" />
<%namespace file="/dataset/security_common.mako" import="render_permission_form" />
+
+<%
+ user = trans.user
+ if user:
+ user_roles = user.all_roles()
+ else:
+ user_roles = None
+%>
<br/><br/>
<ul class="manage-table-actions">
@@ -13,7 +21,7 @@
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
+%if trans.app.security_agent.allow_action( user, user_roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
<%
roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.table.c.name ).all()
%>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/mobile/history/detail.mako
--- a/templates/mobile/history/detail.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/mobile/history/detail.mako Fri Aug 28 16:52:58 2009 -0400
@@ -36,8 +36,14 @@
<div class="secondary">
## Body for history items, extra info and actions, data "peek"
-
- %if not trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
+ <%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+ %>
+ %if not trans.app.security_agent.allow_action( user, roles, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
<div>You do not have permission to view this dataset.</div>
%elif data_state == "queued":
<div>Job is waiting to run</div>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/mobile/manage_library.mako
--- a/templates/mobile/manage_library.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/mobile/manage_library.mako Fri Aug 28 16:52:58 2009 -0400
@@ -3,11 +3,19 @@
<%namespace file="/dataset/security_common.mako" import="render_permission_form" />
<%namespace file="/library/common.mako" import="render_template_info" />
+<%
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
+%>
+
%if msg:
${render_msg( msg, messagetype )}
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MODIFY, library_item=library ):
<div class="toolForm">
<div class="toolFormTitle">Change library name and description</div>
<div class="toolFormBody">
@@ -53,7 +61,7 @@
</div>
</div>
%endif
-%if trans.app.security_agent.allow_action( trans.user, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
+%if trans.app.security_agent.allow_action( user, roles, trans.app.security_agent.permitted_actions.LIBRARY_MANAGE, library_item=library ):
<%
roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.table.c.name ).all()
%>
diff -r 36f438ce1f82 -r cc4944a62b66 templates/root/history_common.mako
--- a/templates/root/history_common.mako Fri Aug 28 15:59:16 2009 -0400
+++ b/templates/root/history_common.mako Fri Aug 28 16:52:58 2009 -0400
@@ -2,12 +2,17 @@
## Render the dataset `data` as history item, using `hid` as the displayed id
<%def name="render_dataset( data, hid, show_deleted_on_refresh = False )">
<%
- if data.state in ['no state','',None]:
- data_state = "queued"
- else:
- data_state = data.state
+ if data.state in ['no state','',None]:
+ data_state = "queued"
+ else:
+ data_state = data.state
+ user = trans.user
+ if user:
+ roles = user.all_roles()
+ else:
+ roles = None
%>
- %if not trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
+ %if not trans.app.security_agent.allow_action( user, roles, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
<div class="historyItemWrapper historyItem historyItem-${data_state} historyItem-noPermission" id="historyItem-${data.id}">
%else:
<div class="historyItemWrapper historyItem historyItem-${data_state}" id="historyItem-${data.id}">
@@ -41,7 +46,7 @@
## Body for history items, extra info and actions, data "peek"
<div id="info${data.id}" class="historyItemBody">
- %if not trans.app.security_agent.allow_action( trans.user, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
+ %if not trans.app.security_agent.allow_action( user, roles, data.permitted_actions.DATASET_ACCESS, dataset = data.dataset ):
<div>You do not have permission to view this dataset.</div>
%elif data_state == "upload":
<div>Dataset is uploading</div>
diff -r 36f438ce1f82 -r cc4944a62b66 test/base/twilltestcase.py
--- a/test/base/twilltestcase.py Fri Aug 28 15:59:16 2009 -0400
+++ b/test/base/twilltestcase.py Fri Aug 28 16:52:58 2009 -0400
@@ -1229,7 +1229,7 @@
tc.fv( '1', 'new_element_description_1', ele_help_1.replace( '+', ' ' ) )
tc.submit( 'new_info_template_button' )
self.home()
- def add_folder( self, library_id, folder_id, name='Folder One', description='NThis is Folder One' ):
+ def add_folder( self, library_id, folder_id, name='Folder One', description='This is Folder One' ):
"""Create a new folder"""
self.home()
self.visit_url( "%s/admin/folder?library_id=%s&id=%s&new=True" % ( self.url, library_id, folder_id ) )
diff -r 36f438ce1f82 -r cc4944a62b66 universe_wsgi.ini.sample
--- a/universe_wsgi.ini.sample Fri Aug 28 15:59:16 2009 -0400
+++ b/universe_wsgi.ini.sample Fri Aug 28 16:52:58 2009 -0400
@@ -25,6 +25,11 @@
#database_engine_option_echo_pool = true
#database_engine_option_pool_size = 10
#database_engine_option_max_overflow = 20
+
+# If using MySQL, see:
+# http://rapd.wordpress.com/2008/03/02/sqlalchemy-sqlerror-operationalerror-2…
+# To handle this issue, try the following setting:
+#database_engine_option_pool_recycle = 7200
# Where dataset files are saved
file_path = database/files
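
The new pool_recycle comment targets MySQL closing idle connections (wait_timeout is commonly eight hours by default), which surfaces as an OperationalError once a pooled connection has gone stale. As a rough sketch, the equivalent setup in plain SQLAlchemy looks like the following; the URL and credentials are placeholders, not Galaxy defaults.

    # Sketch: recycling pooled connections before MySQL times them out.
    from sqlalchemy import create_engine

    engine = create_engine(
        "mysql://galaxy:secret@localhost/galaxy",   # placeholder connection URL
        pool_size=10,
        max_overflow=20,
        pool_recycle=7200,   # drop and reopen connections older than two hours
    )
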
29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/6b66a5e142ab
changeset: 2641:6b66a5e142ab
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Fri Aug 28 15:27:01 2009 -0400
description:
web controller to handle tagging (add/remove/autocomplete)
1 file(s) affected in this change:
lib/galaxy/web/controllers/tag.py
diffs (151 lines):
diff -r b85cb0a65f46 -r 6b66a5e142ab lib/galaxy/web/controllers/tag.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/web/controllers/tag.py Fri Aug 28 15:27:01 2009 -0400
@@ -0,0 +1,147 @@
+"""
+Tags Controller: handles tagging/untagging of entities and provides autocomplete support.
+"""
+
+from galaxy.web.base.controller import *
+from galaxy.tags.tag_handler import *
+from sqlalchemy.sql.expression import func, and_
+from sqlalchemy.sql import select
+
+class TagsController ( BaseController ):
+
+ def __init__(self, app):
+ BaseController.__init__(self, app)
+
+ # Set up dict for mapping from short-hand to full item class.
+ self.shorthand_to_item_class_dict = dict()
+ self.shorthand_to_item_class_dict["history"] = History
+ self.shorthand_to_item_class_dict["hda"] = HistoryDatasetAssociation
+
+ # Set up tag handler to recognize the following items: History, HistoryDatasetAssociation, ...
+ self.tag_handler = TagHandler()
+ self.tag_handler.add_tag_assoc_class(History, HistoryTagAssociation)
+ self.tag_handler.add_tag_assoc_class(HistoryDatasetAssociation, HistoryDatasetAssociationTagAssociation)
+
+ @web.expose
+ def add_tag_async( self, trans, id=None, item_type=None, new_tag=None ):
+ """ Add tag to an item. """
+ item = self._get_item(trans, item_type, trans.security.decode_id(id))
+
+ self._do_security_check(trans, item)
+
+ self.tag_handler.apply_item_tags(trans.sa_session, item, new_tag)
+ trans.sa_session.flush()
+
+ @web.expose
+ def remove_tag_async( self, trans, id=None, item_type=None, tag_name=None ):
+ """ Remove tag from an item. """
+ item = self._get_item(trans, item_type, trans.security.decode_id(id))
+
+ self._do_security_check(trans, item)
+
+ self.tag_handler.remove_item_tag(item, tag_name)
+ trans.sa_session.flush()
+
+ # Retag an item. All previous tags are deleted and new tags are applied.
+ @web.expose
+ def retag_async( self, trans, id=None, item_type=None, new_tags=None ):
+ """ Apply a new set of tags to an item; previous tags are deleted. """
+ item = self._get_item(trans, item_type, trans.security.decode_id(id))
+
+ self._do_security_check(trans, item)
+
+        # Delete all of the item's existing tags, then apply the new set.
+        self.tag_handler.delete_item_tags(item)
+        self.tag_handler.apply_item_tags(trans.sa_session, item, new_tags)
+        # Flush to complete changes.
+        trans.sa_session.flush()
+
+ @web.expose
+ @web.require_login( "get autocomplete data for an item's tags" )
+ def tag_autocomplete_data(self, trans, id=None, item_type=None, q=None, limit=None, timestamp=None):
+ """ Get autocomplete data for an item's tags. """
+ item = self._get_item(trans, item_type, trans.security.decode_id(id))
+
+ self._do_security_check(trans, item)
+
+ #
+ # Get user's item tags and usage counts.
+ #
+
+ # Get item-tag association class.
+ item_tag_assoc_class = self.tag_handler.get_tag_assoc_class(item.__class__)
+
+ # Build select statement.
+ cols_to_select = [ item_tag_assoc_class.table.c.tag_id, item_tag_assoc_class.table.c.user_tname, item_tag_assoc_class.table.c.user_value, func.count('*') ]
+ from_obj = item_tag_assoc_class.table.join(item.table).join(Tag)
+ where_clause = self._get_column_for_filtering_item_by_user_id(item)==trans.get_user().id
+ order_by = [ func.count("*").desc() ]
+ ac_for_names = not q.endswith(":")
+ if ac_for_names:
+ # Autocomplete for tag names.
+ where_clause = and_(where_clause, Tag.table.c.name.like(q + "%"))
+ group_by = item_tag_assoc_class.table.c.tag_id
+ else:
+ # Autocomplete for tag values.
+ tag_name_and_value = q.split(":")
+ tag_name = tag_name_and_value[0]
+ tag_value = tag_name_and_value[1]
+ where_clause = and_(where_clause, Tag.table.c.name==tag_name)
+ where_clause = and_(where_clause, item_tag_assoc_class.table.c.value.like(tag_value + "%"))
+ group_by = item_tag_assoc_class.table.c.value
+
+ # Do query and get result set.
+ query = select(columns=cols_to_select, from_obj=from_obj,
+ whereclause=where_clause, group_by=group_by, order_by=order_by)
+ result_set = trans.sa_session.execute(query)
+
+ # Create and return autocomplete data.
+ if ac_for_names:
+ # Autocomplete for tag names.
+ ac_data = "#Header|Your Tags\n"
+ for row in result_set:
+ # Exclude tags that are already applied to the history.
+ if self.tag_handler.item_has_tag(item, row[1]):
+ continue
+ # Add tag to autocomplete data.
+ ac_data += row[1] + "|" + row[1] + "\n"
+ else:
+ # Autocomplete for tag values.
+ ac_data = "#Header|Your Values for '%s'\n" % (tag_name)
+ for row in result_set:
+ ac_data += tag_name + ":" + row[2] + "|" + row[2] + "\n"
+
+ return ac_data
+
+ def _get_column_for_filtering_item_by_user_id(self, item):
+ """ Returns the column to use when filtering by user id. """
+ if isinstance(item, History):
+ return item.table.c.user_id
+ elif isinstance(item, HistoryDatasetAssociation):
+ # Use the user_id associated with the HDA's history.
+ history = item.history
+ return history.table.c.user_id
+
+ def _get_item(self, trans, item_type, id):
+ """ Get an item based on type and id. """
+ item_class = self.shorthand_to_item_class_dict[item_type]
+ item = trans.sa_session.query(item_class).filter("id=" + str(id))[0]
+ return item;
+
+ def _do_security_check(self, trans, item):
+ """ Do security check on an item. """
+ if isinstance(item, History):
+ history = item;
+ # Check that the history exists, and is either owned by the current
+ # user (if logged in) or the current history
+ assert history is not None
+ if history.user is None:
+ assert history == trans.get_history()
+ else:
+ assert history.user == trans.user
+ elif isinstance(item, HistoryDatasetAssociation):
+ # TODO.
+ pass
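
tag_autocomplete_data returns a small pipe-delimited payload: a "#Header|<title>" line followed by "<value to insert>|<display text>" rows. A self-contained sketch of a client-side parser for that format (the helper name and sample tags are made up):

    def parse_tag_autocomplete(ac_data):
        # Split the payload produced above into (header, [(insert, display), ...]).
        lines = [line for line in ac_data.splitlines() if line]
        header = ""
        if lines and lines[0].startswith("#Header|"):
            header = lines.pop(0).split("|", 1)[1]
        suggestions = [tuple(line.split("|", 1)) for line in lines]
        return header, suggestions

    sample = "#Header|Your Tags\nrnaseq|rnaseq\nmouse|mouse\n"
    print(parse_tag_autocomplete(sample))
    # -> ('Your Tags', [('rnaseq', 'rnaseq'), ('mouse', 'mouse')])
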
29 Aug '09
details: http://www.bx.psu.edu/hg/galaxy/rev/36f438ce1f82
changeset: 2639:36f438ce1f82
user: Kelly Vincent <kpvincent(a)bx.psu.edu>
date: Fri Aug 28 15:59:16 2009 -0400
description:
Added samtools-based tools (sam_to_bam, sam_merge, sam_pileup) with their supporting files and modified Bam datatype so temp files are properly cleaned up
15 file(s) affected in this change:
lib/galaxy/datatypes/images.py
test-data/chrM.fa
test-data/sam_to_bam_in1.sam
test-data/sam_to_bam_in2.sam
test-data/sam_to_bam_out1.bam
test-data/sam_to_bam_out2.bam
tool-data/sam_fa_indices.loc.sample
tool_conf.xml.sample
tools/samtools/sam_merge.py
tools/samtools/sam_merge.xml
tools/samtools/sam_merge_code.py
tools/samtools/sam_pileup.py
tools/samtools/sam_pileup.xml
tools/samtools/sam_to_bam.py
tools/samtools/sam_to_bam.xml
diffs (843 lines):
diff -r d261f41a2a03 -r 36f438ce1f82 lib/galaxy/datatypes/images.py
--- a/lib/galaxy/datatypes/images.py Fri Aug 28 15:29:53 2009 -0400
+++ b/lib/galaxy/datatypes/images.py Fri Aug 28 15:59:16 2009 -0400
@@ -252,14 +252,17 @@
index_file = dataset.metadata.spec['bam_index'].param.new_file( dataset = dataset )
tmp_dir = tempfile.gettempdir()
tmpf1 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+ tmpf1bai = '%s.bai' % tmpf1.name
try:
subprocess.check_call(['cd', tmp_dir], shell=True)
subprocess.check_call('cp %s %s' % (dataset.file_name, tmpf1.name), shell=True)
subprocess.check_call('samtools index %s' % tmpf1.name, shell=True)
- subprocess.check_call('cp %s.bai %s' % (tmpf1.name, index_file.file_name), shell=True)
+ subprocess.check_call('cp %s %s' % (tmpf1bai, index_file.file_name), shell=True)
except subprocess.CalledProcessError:
sys.stderr.write('There was a problem creating the index for the BAM file\n')
tmpf1.close()
+ if os.path.exists(tmpf1bai):
+ os.remove(tmpf1bai)
dataset.metadata.bam_index = index_file
def set_peek( self, dataset ):
if not dataset.dataset.purged:
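
The Bam datatype hunk above stops the temporary copy's .bai (written next to the copy by "samtools index") from being left behind in the temp directory. A hedged, self-contained sketch of the same copy/index/clean-up flow using subprocess and try/finally; the function name and paths are illustrative, not Galaxy's code.

    import os, shutil, subprocess, tempfile

    def build_bam_index(bam_path, index_dest):
        # samtools writes the index alongside its input as "<input>.bai".
        tmp = tempfile.NamedTemporaryFile(suffix=".bam", delete=False)
        tmp.close()
        tmp_bai = tmp.name + ".bai"
        try:
            shutil.copy(bam_path, tmp.name)
            subprocess.check_call(["samtools", "index", tmp.name])
            shutil.copy(tmp_bai, index_dest)
        finally:
            # Always remove the temporary BAM copy and its sibling index.
            for path in (tmp.name, tmp_bai):
                if os.path.exists(path):
                    os.remove(path)
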
diff -r d261f41a2a03 -r 36f438ce1f82 test-data/chrM.fa
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/chrM.fa Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,335 @@
+>chrM
+GTTAATGTAGCTTAATAATATAAAGCAAGGCACTGAAAATGCCTAGATGA
+GTATTCTTACTCCATAAACACATAGGCTTGGTCCTAGCCTTTTTATTAGT
+TATTAATAGAATTACACATGCAAGTATCCGCACCCCAGTGAGAATGCCCT
+CTAAATCACGTCTCTACGATTAAAAGGAGCAGGTATCAAGCACACTAGAA
+AGTAGCTCATAACACCTTGCTCAGCCACACCCCCACGGGACACAGCAGTG
+ATAAAAATTAAGCTATGAACGAAAGTTCGACTAAGTCATATTAAATAAGG
+GTTGGTAAATTTCGTGCCAGCCACCGCGGTCATACGATTAACCCAAATTA
+ATAAATCTCCGGCGTAAAGCGTGTCAAAGACTAATACCAAAATAAAGTTA
+AAACCCAGTTAAGCCGTAAAAAGCTACAACCAAAGTAAAATAGACTACGA
+AAGTGACTTTAATACCTCTGACTACACGATAGCTAAGACCCAAACTGGGA
+TTAGATACCCCACTATGCTTAGCCCTAAACTAAAATAGCTTACCACAACA
+AAGCTATTCGCCAGAGTACTACTAGCAACAGCCTAAAACTCAAAGGACTT
+GGCGGTGCTTTACATCCCTCTAGAGGAGCCTGTTCCATAATCGATAAACC
+CCGATAAACCCCACCATCCCTTGCTAATTCAGCCTATATACCGCCATCTT
+CAGCAAACCCTAAACAAGGTACCGAAGTAAGCACAAATATCCAACATAAA
+AACGTTAGGTCAAGGTGTAGCCCATGGGATGGAGAGAAATGGGCTACATT
+TTCTACCCTAAGAACAAGAACTTTAACCCGGACGAAAGTCTCCATGAAAC
+TGGAGACTAAAGGAGGATTTAGCAGTAAATTAAGAATAGAGAGCTTAATT
+GAATCAGGCCATGAAGCGCGCACACACCGCCCGTCACCCTCCTTAAATAT
+CACAAATCATAACATAACATAAAACCGTGACCCAAACATATGAAAGGAGA
+CAAGTCGTAACAAGGTAAGTATACCGGAAGGTGTACTTGGATAACCAAAG
+TGTAGCTTAAACAAAGCATCCAGCTTACACCTAGAAGATTTCACTCAAAA
+TGAACACTTTGAACTAAAGCTAGCCCAAACAATACCTAATTCAATTACCC
+TTAGTCACTTAACTAAAACATTCACCAAACCATTAAAGTATAGGAGATAG
+AAATTTTAACTTGGCGCTATAGAGAAAGTACCGTAAGGGAACGATGAAAG
+ATGCATTAAAAGTACTAAACAGCAAAGCTTACCCCTTTTACCTTTTGCAT
+AATGATTTAACTAGAATAAACTTAGCAAAGAGAACTTAAGCTAAGCACCC
+CGAAACCAGACGAGCTACCTATGAACAGTTACAAATGAACCAACTCATCT
+ATGTCGCAAAATAGTGAGAAGATTCGTAGGTAGAGGTGAAAAGCCCAACG
+AGCCTGGTGATAGCTGGTTGTCCAGAAACAGAATTTCAGTTCAAATTTAA
+ATTTACCTAAAAACTACTCAATTCTAATGTAAATTTAAATTATAGTCTAA
+AAAGGTACAGCTTTTTAGATACAGGTTACAACCTTCATTAGAGAGTAAGA
+ACAAGATAAACCCATAGTTGGCTTAAAAGCAGCCATCAATTAAGAAAGCG
+TTCAAGCTCAACGACACATCTATCTTAATCCCAACAATCAACCCAAACTA
+ACTCCTAATCTCATACTGGACTATTCTATCAACACATAGAAGCAATAATG
+TTAATATGAGTAACAAGAATTATTTCTCCTTGCATAAGCTTATATCAGAA
+CGAATACTCACTGATAGTTAACAACAAGATAGGGATAATCCAAAAACTAA
+TCATCTATTTAAACCATTGTTAACCCAACACAGGCATGCATCTATAAGGA
+AAGATTAAAAGAAGTAAAAGGAACTCGGCAAACACAAACCCCGCCTGTTT
+ACCAAAAACATCACCTCTAGCATTTCCAGTATTAGAGGCACTGCCTGCCC
+AGTGACATCTGTTtaaacggccgcggtatcctaaccgtgcaaaggtagca
+taatcacttgttccctaaatagggacttgtatgaatggccacacgagggt
+tttactgtctcttacttccaatcagtgaaattgaccttcccgtgaagagg
+cgggaatgactaaataagacgagaagaccctatggagcttTAATTAACTG
+ATTCACAAAAAACAACACACAAACCTTAACCTTCAGGGACAACAAAACTT
+TTGATTGAATCAGCAATTTCGGTTGGGGTGACCTCGGAGAACAAAACAAC
+CTCCGAGTGATTTAAATCCAGACTAACCAGTCAAAATATATAATCACTTA
+TTGATCCAAACCATTGATCAACGGAACAAGTTACCCTAGGGATAACAGCG
+CAATCCTATTCCAGAGTCCATATCGACAATTAGGGTTTACGACCTCGATG
+TTGGATCAAGACATCCTAATGGTGCAACCGCTATTAAGGGTTCGTTTGTT
+CAACGATTAAAGTCTTACGTGATCTGAGTTCAGACCGGAGTAATCCAGGT
+CGGTTTCTATCTATTCTATACTTTTCCCAGTACGAAAGGACAAGAAAAGT
+AGGGCCCACTTTACAAGAAGCGCCCTCAAACTAATAGATGACATAATCTA
+AATCTAACTAATTTATAACTTCTACCGCCCTAGAACAGGGCTCgttaggg
+tggcagagcccggaaattgcataaaacttaaacctttacactcagaggtt
+caactcctctccctaacaacaTGTTCATAATTAACGTCCTCCTCCTAATT
+GTCCCAATCTTGCTCGCCGTAGCATTCCTCACACTAGTTGAACGAAAAGT
+CTTAGGCTATATGCAACTTCGCAAAGGACCCAACATCGTAGGCCCCTATG
+GCCTACTACAACCTATTGCCGATGCCCTCAAACTATTTATCAAAGAGCCA
+CTACAACCACTAACATCATCGACATCCATATTCATCATCGCACCAATCCT
+AGCCCTAACCCTGGCCTTAACCATATGAATCCCTCTGCCCATACCATACC
+CACTAATCAACATAAACCTAGGAATTCTATTCATACTAGCCATGTCCAGC
+CTAGCTGTCTACTCAATCCTTTGATCAGGATGGGCCTCAAACTCAAAATA
+CGCCCTAATTGGAGCTCTACGAGCAGTAGCACAAACCATCTCATACGAAG
+TAACTCTAGCAATCATCCTACTCTCAGTCCTCCTAATAAGCGGATCATTC
+ACATTATCAACACTTATTATTACCCAAGAATACCTCTGATTAATCTTCCC
+ATCATGACCCTTAGCCATAATGTGATTCATCTCAACATTAGCCGAAACCA
+ACCGAGCTCCATTTGACCTAACAGAAGGAGAATCAGAACTCGTCTCTGGA
+TTCAACGTTGAATACGCAGCCGGCCCATTTGCTCTATTCTTCCTAGCAGA
+ATACGCAAACATCATCATGATAAACATCTTCACAACAACCCTATTTCTAG
+GAGCATTTCACAACCCCTACCTGCCAGAACTCTACTCAATTAATTTCACC
+ATTAAAGCTCTCCTTCTAACATGTTCCTTCCTATGAATCCGAGCATCCTA
+CCCACGATTCCGATATGACCAACTTATACACCTCCTATGAAAGAACTTCC
+TACCACTCACACTAGCCCTCTGCATATGACACGTCTCACTTCCAATCATA
+CTATCCAGCATCCCACCACAAACATAGGAAATATGTCTGACAAAAGAGTT
+ACTTTGATAGAGTAAAACATAGAGGCTCAAACCCTCTTATTTctagaact
+acaggaattgaacctgctcctgagaattcaaaatcctccgtgctaccgaa
+ttacaccatgtcctaCAAGTAAGGTCAGCTAAATAAGCTATCGGGCCCAT
+ACCCCGAAAATGTTGGATTACACCCTTCCCGTACTAATAAATCCCCTTAT
+CTTCACAACTATTCTAATAACAGTTCTTCTAGGAACTATAATCGTTATAA
+TAAGCTCACACTGACTAATAATCTGAATCGGATTTGAAATAAATCTACTA
+GCCATTATCCCTATCCTAATAAAAAAGTACAATCCCCGAACCATAGAAGC
+CTCCACCAAATATTTTCTAACCCAAGCCACCGCATCAATACTCCTCATAA
+TAGCGATCATCATTAACCTCATACACTCAGGCCAATGAACAATCACAAAA
+GTCTTCAACCCCACAGCGTCCATCATTATAACTTCAGCTCTCGCCATAAA
+ACTTGGACTCACACCATTCCACTTCTGAGTACCCGAAGTCACACAGGGCA
+TCTCATTAACATCAGGTCTCATCCTACTTACATGACAAAAACTAGCCCCA
+ATATCAATCCTATATCAAATCTCACCCTCAATTAACCTAAATATCTTATT
+AACTATAGCCGTACTGTCAATCCTAGTAGGAGGCTGAGGCGGTCTCAACC
+AAACCCAACTACGAAAAATCATAGCATACTCGTCAATCGCGCATATAGGA
+TGAATAACAGCTGTCCTAGTATATAACCCAACACTAACAATACTAAACAT
+ATTAATTTACATTATAATAACACTCACAATATTCATACTATTTATCCACA
+GCTCCTCTACTACAACACTATCACTCTCCCACACATGAAACAAAATACCT
+CTAACCACTACACTAATCTTAATTACCTTACTATCCATAGGAGGCCTCCC
+CCCACTATCAGGATTCATACCCAAATGAATAATCATTCAAGAGCTCACCA
+AAAATAGCAGCATCATCCTCCCCACACTAATAGCCATTATAGCACTACTC
+AACCTCTACTTCTACATACGACTAACCTATTCCACCTCACTGACCATATT
+CCCATCCACAAACAACATAAAAATAAAATGACAATTCGAAACCAAACGAA
+TTACTCTCTTACCCCCGTTAATTGTTATATCCTCCCTACTCCTCCCCCTA
+ACCCCCATACTATCAATTTTGGACTAGGAATTTAGGTTAACATCCCAGAC
+CAAGAGCCTTCAAAGCTCTAAGCAAGTGAATCCACTTAATTCCTGCATAC
+TAAGGACTGCGAGACTCTATCTCACATCAATTGAACGCAAATCAAACTCT
+TTTATTAAGCTAAGCCCTTACTAGATTGGTGGGCTACCATCCCACGAAAT
+TTTAGTTAACAGCTAAATACCCTAATCAACTGGCTTCAATCTACTTCTCC
+CGCCGCCTAGAAAAAAAGGCGGGAGAAGCCCCGGCAGAAATTGAAGCTGC
+TCCTTTGAATTTGCAATTCAATGTGAAAATTCACCACGGGACTTGATAAG
+AAGAGGATTCCAACCCCTGTCTTTAGATTTACAGTCTAATGCTTACTCAG
+CCATCTTACCTATGTTCATCAACCGCTGACTATTTTCAACTAACCACAAA
+GACATCGGCACTCTGTACCTCCTATTCGGCGCTTGAGCTGGAATAGTAGG
+AACTGCCCTAAGCCTCCTAATCCGTGCTGAATTAGGCCAACCTGGGACCC
+TACTAGGAGATGATCAGATCTACAATGTCATTGTAACCGCCCATGCATTC
+GTAATAATTTTCTTTATGGTCATACCCATTATAATCGGAGGATTCGGAAA
+CTGATTAGTCCCCCTGATAATTGGAGCACCTGATATAGCTTTCCCCCGAA
+TAAACAACATAAGCTTCTGATTACTTCCCCCATCATTCCTACTTCTTCTC
+GCTTCCTCAATAATTGAAGCAGGTGCCGGAACAGGCTGAACCGTATATCC
+TCCTCTAGCTGGAAATCTGGCGCATGCAGGAGCCTCTGTTGACTTAACCA
+TTTTCTCTCTCCACCTAGCTGGGGTGTCCTCGATTTTAGGTGCCATCAAC
+TTTATTACCACAATCATTAACATAAAACCACCAGCCCTATCCCAATATCA
+AACCCCCCTATTCGTTTGATCTGTCCTTATTACGGCAGTACTCCTTCTCC
+TAGCCCTCCCGGTCCTAGCAGCAGGCATTACCATGCTTCTCACAGACCGT
+AACCTGAACACTACTTTCTTCGACCCCGCAGGAGGAGGGGATCCAATCCT
+TTATCAACACCTATTCTGATTCTTCGGACACCCCGAAGTCTATATTCTTA
+TCCTACCAGGCTTCGGTATAATCTCACACATCGTCACATACTACTCAGGT
+AAAAAGGAACCTTTTGGCTACATGGGTATAGTGTGAGCTATAATATCCAT
+TGGCTTTCTAGGCTTCATCGTATGGGCTCACCACATGTTTACAGTAGGGA
+TAGACGTTGACACACGAGCATACTTCACATCAGCTACCATAATCATCGCT
+ATCCCTACTGGTGTAAAAGTATTCAGCTGACTAGCCACCCTGCACGGAGG
+AAATATCAAATGATCTCCAGCTATACTCTGAGCTCTAGGCTTCATCTTCT
+TATTCACAGTAGGAGGTCTAACAGGAATCGTCCTAGCTAACTCATCCCTA
+GATATTGTTCTCCACGATACTTATTATGTAGTAGCACATTTCCATTATGT
+CCTGTCTATAGGAGCAGTCTTCGCCATTATGGGGGGATTTGTACACTGAT
+TCCCTCTATTCTCAGGATACACACTCAACCAAACCTGAGCAAAAATCCAC
+TTTACAATTATATTCGTAGGGGTAAATATAACCTTCTTCCCACAACATTT
+CCTTGGCCTCTCAGGAATGCCACGACGCTATTCTGATTATCCAGACGCAT
+ATACAACATGAAATACCATCTCATCCATAGGATCTTTTATCTCACTTACA
+GCAGTGATACTAATAATTTTCATAATTTGAGAAGCGTTCGCATCCAAACG
+AGAAGTGTCTACAGTAGAATTAACCTCAACTAATCTGGAATGACTACACG
+GATGCCCCCCACCATACCACACATTTGAAGAACCCACCTACGTAAACCTA
+AAAtaagaaaggaaggaatcgaaccccctctaactggtttcaagccaata
+tcataaccactatgtctttctcCATCAATTGAGGTATTAGTAAAAATTAC
+ATGACTTTGTCAAAGTTAAATTATAGGTTAAACCCCTATATACCTCTATG
+GCCTACCCCTTCCAACTAGGATTCCAAGACGCAACATCCCCTATTATAGA
+AGAACTCCTACACTTCCACGACCACACACTAATAATCGTATTCCTAATTA
+GCTCTCTAGTATTATATATTATCTCATCAATACTAACAACTAAATTAACC
+CATACCAGCACCATAGATGCTCAAGAAGTAGAGACAATTTGAACGATTTT
+ACCAGCCATCATCCTTATTCTAATCGCCCTCCCATCCCTACGAATTCTAT
+ATATAATAGATGAAATCAATAATCCGTCCCTCACAGTCAAAACAATAGGC
+CACCAATGATACTGAAGCTACGAGTATACCGATTACGAAGACTTGACCTT
+TGACTCCTACATGATCCCCACATCAGACCTAAAACCAGGAGAATTACGTC
+TTCTAGAAGTCGACAATCGAGTGGTTCTCCCCATAGAAATAACCATCCGA
+ATGCTAATTTCATCCGAAGACGTCCTACACTCATGAGCTGTGCCCTCCCT
+AGGCCTAAAAACAGACGCTATCCCTGGGCGCCTAAATCAGACAACTCTCG
+TGGCCTCTCGACCAGGACTTTACTACGGTCAATGCTCAGAGATCTGCGGA
+TCAAACCACAGCTTTATACCAATTGTCCTTGAACTAGTTCCACTGAAACA
+CTTCGAAGAATGATCTGCATCAATATTATAAAGTCACTAAGAAGCTATTA
+TAGCATTAACCTTTTAAGTTAAAGATTGAGGGTTCAACCCCCTCCCTAGT
+GATATGCCACAGTTGGATACATCAACATGATTTATTAATATCGTCTCAAT
+AATCCTAACTCTATTTATTGTATTTCAACTAAAAATCTCAAAGCACTCCT
+ATCCGACACACCCAGAAGTAAAGACAACCAAAATAACAAAACACTCTGCC
+CCTTGAGAATCAAAATGAACGAAAATCTATTCGCCTCTTTCGCTACCCCA
+ACAATAGTAGGCCTCCCTATTGTAATTCTGATCATCATATTTCCCAGCAT
+CCTATTCCCCTCACCCAACCGACTAATCAACAATCGCCTAATCTCAATTC
+AACAATGGCTAGTCCAACTTACATCAAAACAAATAATAGCTATCCATAAC
+AGCAAAGGACAAACCTGAACTCTTATACTCATATCACTGATCCTATTCAT
+TGGCTCAACAAACTTATTAGGCCTACTACCTCACTCATTTACACCAACAA
+CACAACTATCAATAAACCTAGGCATAGCTATTCCCCTATGGGCAGGGACA
+GTATTCATAGGCTTTCGTCACAAAACAAAAGCAGCCCTAGCCCACTTTCT
+ACCTCAAGGGACGCCCATTTTCCTCATCCCCATACTAGTAATTATCGAGA
+CTATCAGCCTATTTATTCAACCTGTAGCCCTAGCCGTGCGGCTAACCGCT
+AACATTACCGCCGGACACCTCCTAATACACCTCATCGGAGGGGCAACACT
+AGCCCTCATAAGCATCAGCCCCTCAACAGCCCTTATTACGTTTATCATCC
+TAATTCTACTAACTATCCTCGAATTCGCAGTAGCTATAATCCAAGCCTAC
+GTATTCACTCTCCTGGTAAGCCTTTACTTACACGACAACACCTAATGACC
+CACCAAACCCACGCTTACCACATAGTAAACCCCAGCCCATGACCACTTAC
+AGGAGCCCTATCAGCCCTCCTGATAACATCAGGACTAGCCATGTGATTTC
+ACTTTAACTCAACCTTACTTCTAGCTATAGGGCTATTAACTAACATCCTT
+ACCATATATCAATGATGACGAGACATCATCCGAGAAAGCACATTCCAAGG
+CCATCACACATCAATCGTTCAAAAGGGACTCCGATATGGCATAATCCTTT
+TTATTATCTCAGAAGTCTTCTTCTTCTCTGGCTTCTTCTGAGCCTTTTAC
+CACTCAAGCCTAGCCCCCACACCCGAACTAGGCGGCTGCTGACCACCCAC
+AGGTATCCACCCCTTAAACCCCCTAGAAGTCCCCTTACTCAACACCTCAG
+TGCTCCTAGCATCTGGAGTCTCTATCACCTGAGCCCACCATAGCCTAATA
+GAAGGAAACCGTAAAAATATGCTCCAAGGCCTATTCATCACAATTTCACT
+AGGCGTATACTTCACCCTTCTCCAAGCCTCAGAATACTATGAAGCCTCAT
+TTACTATTTCAGATGGAGTATACGGATCAACATTTTTCGTAGCAACAGGG
+TTCCACGGACTACACGTAATTATCGGATCTACCTTCCTCATTGTATGTTT
+CCTACGCCAACTAAAATTCCACTTTACATCCAGCCACCACTTCGGATTCG
+AAGCAGCCGCTTGATACTGACACTTCGTCGACGTAGTCTGACTATTCTTG
+TACGTCTCTATTTATTGATGAGGATCCTATTCTTTTAGTATTGACCAGTA
+CAATTGACTTCCAATCAATCAGCTTCGGTATAACCCGAAAAAGAATAATA
+AACCTCATACTGACACTCCTCACTAACACATTACTAGCCTCGCTACTCGT
+ACTCATCGCATTCTGACTACCACAACTAAACATCTATGCAGAAAAAACCA
+GCCCATATGAATGCGGATTTGACCCTATAGGGTCAGCACGCCTCCCCTTC
+TCAATAAAATTTTTCTTAGTGGCCATTACATTTCTGCTATTCGACTTAGA
+AATTGCCCTCCTATTACCCCTTCCATGAGCATCCCAAACAACTAACCTAA
+ACACTATACTTATCATAGCACTAGTCCTAATCTCTCTTCTAGCCATCAGC
+CTAGCCTACGAATGAACCCAAAAAGGACTAGAATGAACTGAGTATGGTAA
+TTAGTTTAAACCAAAACAAATGATTTCGACTCATTAAACTATGATTAACT
+TCATAATTACCAACATGTCACTAGTCCATATTAATATCTTCCTAGCATTC
+ACAGTATCCCTCGTAGGCCTACTAATGTACCGATCCCACCTAATATCCTC
+ACTCCTATGCCTAGAAGGAATAATACTATCACTATTCGTCATAGCAACCA
+TAATAGTCCTAAACACCCACTTCACACTAGCTAGTATAATACCTATCATC
+TTACTAGTATTTGCTGCCTGCGAACGAGCTCTAGGATTATCCCTACTAGT
+CATAGTCTCCAATACTTATGGAGTAGACCACGTACAAAACCTTAACCTCC
+TCCAATGCTAAAAATTATCATTCCCACAATCATACTTATGCCCCTTACAT
+GACTATCAAAAAAGAATATAATCTGAATCAACACTACAACCTATAGTCTA
+TTAATCAGCCTTATCAGCCTATCCCTCCTAAACCAACCTAGCAACAATAG
+CCTAAACTTCTCACTAATATTCTTCTCCGATCCCCTATCAGCCCCACTTC
+TGGTGTTGACAACATGACTACTGCCACTAATACTCATAGCCAGCCAACAC
+CATCTATCTAAGGAACCACTAATCCGAAAAAAACTCTACATCACCATGCT
+AACCATACTTCAAACTTTCCTAATCATGACTTTTACCGCCACAGAACTAA
+TCTCCTTCTACATCCTATTTGAAGCCACATTAGTTCCAACACTAATTATC
+ATCACCCGCTGAGGCAACCAAACAGAACGCCTGAACGCAGGCCTCTACTT
+CCTATTCTACACACTAATAGGTTCCCTCCCACTCTTAGTTGCACTAATCT
+CTATCCAAAACCTAACAGGCTCACTAAACTTCCTATTAATTCAATACTGA
+AACCAAGCACTACCCGACTCTTGATCCAATATTTTCCTATGACTAGCATG
+TATAATAGCATTCATAGTCAAAATACCGGTATATGGTCTTCACCTCTGAC
+TCCCAAAAGCCCATGTAGAAGCCCCAATTGCCGGATCCATAGTGCTAGCA
+GCCATTCTACTAAAACTAGGAGGCTACGGAATACTACGAATTACAACAAT
+ACTAAACCCCCAAACTAGCTTTATAGCCTACCCCTTCCTCATACTATCCC
+TGTGAGGAATAATCATAACTAGTTCCATCTGCTTGCGACAAACCGATCTA
+AAATCACTTATTGCATACTCCTCTGTCAGCCACATAGCCCTAGTAATCGT
+AGCCGTCCTCATCCAAACACCATGAAGTTATATAGGAGCTACAGCCCTAA
+TAATCGCTCACGGCCTTACATCATCAATACTATTCTGCCTGGCAAACTCA
+AATTACGAACGTACCCATAGCCGAACTATAATCCTAGCCCGCGGGCTTCA
+AACACTTCTTCCCCTTATAGCAGCCTGATGACTATTAGCCAGCCTAACCA
+ACCTGGCCCTCCCTCCCAGCATTAACCTAATTGGAGAGCTATTCGTAGTA
+ATATCATCATTCTCATGATCAAATATTACCATTATCCTAATAGGAGCCAA
+TATCACCATCACCGCCCTCTACTCCCTATACATACTAATCACAACACAAC
+GAGGGAAATACACACACCATATCAACAGCATTAAACCTTCATTTACACGA
+GAAAACGCACTCATGGCCCTCCACATGACTCCCCTACTACTCCTATCACT
+TAACCCTAAAATTATCCTAGGCTTTACGTACTGTAAATATAGTTTAACAA
+AAACACTAGATTGTGGATCTAGAAACAGAAACTTAATATTTCTTATTTAC
+CGAGAAAGTATGCAAGAACTGCTAATTCATGCCCCCATGTCCAACAAACA
+TGGCTCTCTCAAACTTTTAAAGGATAGGAGCTATCCGTTGGTCTTAGGAA
+CCAAAAAATTGGTGCAACTCCAAATAAAAGTAATCAACATGTTCTCCTCC
+CTCATACTAGTTTCACTATTAGTACTAACCCTCCCAATCATATTATCAAT
+CTTCAATACCTACAAAAACAGCACGTTCCCGCATCATGTAAAAAACACTA
+TCTCATATGCCTTCATTACTAGCCTAATTCCCACTATAATATTTATTCAC
+TCTGGACAAGAAACAATTATCTCAAACTGACACTGAATAACCATACAAAC
+CCTCAAACTATCCCTAAGCTTCAAACTAGATTACTTCTCAATAATTTTCG
+TACCAGTAGCCCTATTCGTAACATGATCTATTATGGAATTCTCCCTATGA
+TACATGCACTCAGATCCTTACATTACTCGATTTTTTAAATACTTACTTAC
+ATTCCTCATCACTATAATAATTCTAGTCACAGCTAACAACCTTTTCCAAC
+TGTTCATCGGATGGGAGGGAGTAGGCATCATGTCATTCTTACTAATCGGA
+TGATGATACGGCCGAACAGATGCCAACACCGCGGCCCTTCAAGCAATCCT
+TTATAACCGCATCGGGGATATCGGCTTCATCATGGCCATAGCCTGATTCC
+TATTCAACACCAACACATGAGACCTCCAACAAATCTTCATACTCGACCCC
+AACCTTACCAACCTCCCGCTCCTAGGCCTCCTCCTAGCCGCAACTGGCAA
+ATCCGCTCAATTTGGACTCCACCCATGACTTCCTTCAGCCATAGAGGGCC
+CTACACCAGTCTCAGCCCTACTCCACTCCAGCACAATAGTTGTAGCAGGC
+GTCTTCCTGCTAATCCGCTTCCATCCACTAATAGAAAACAACAAAACAAT
+CCAGTCACTTACCCTATGCCTAGGAGCCATCACCACACTATTCACAGCAA
+TCTGCGCACTCACTCAAAACGATATCAAAAAAATCATTGCTTTCTCCACC
+TCCAGCCAACTAGGCCTGATAATCGTAACCATCGGTATCAATCAACCCTA
+CCTAGCATTCCTCCACATTTGCACTCACGCATTCTTCAAAGCTATACTAT
+TTATATGTTCCGGATCCATTATCCACAGCCTAAATGACGAGCAAGATATC
+CGAAAAATAGGCGGACTATTTAATGCAATACCCTTCACCACCACATCTCT
+AATTATTGGCAGCCTTGCACTCACCGGAATTCCTTTCCTCACAGGCTTCT
+ACTCCAAAGACCTCATCATCGAAACCGCCAACACATCGTACACCAACGCC
+TGAGCCCTACTAATAACTCTCATTGCCACATCCCTCACAGCTGTCTACAG
+TACCCGAATCATCTTCTTTGCACTCCTAGGGCAACCCCGCTTCCTCCCTC
+TGACCTCAATCAACGAAAATAACCCCTTTCTAATTAACTCCATCAAACGC
+CTCTTAATTGGCAGCATTTTTGCCGGATTCTTCATCTCCAACAATATCTA
+CCCCACAACCGTCCCAGAAATAACCATACCTACTTACATAAAACTCACCG
+CCCTCGCAGTAACCATCCTAGGATTTACACTAGCCCTAGAACTAAGCTTG
+ATAACCCATAACTTAAAACTAGAACACTCCACCAACGTATTCAAATTCTC
+CAACCTCCTAGGATACTACCCAACAATTATACACCGACTCCCACCGCTCG
+CTAACCTATCAATAAGCCAAAAATCAGCATCACTTCTACTAGACTCAATC
+TGACTAGAAAACATCCTGCCAAAATCTATCTCCCAGTTCCAAATAAAAAC
+CTCGATCCTAATTTCCACCCAAAAAGGACAAATCAAATTATATTTCCTCT
+CATTCCTCATCACCCTTACCCTAAGCATACTACTTTTTAATCTCCACGAG
+TAACCTCTAAAATTACCAAGACCCCAACAAGCAACGATCAACCAGTCACA
+ATCACAACCCAAGCCCCATAACTATACAATGCAGCAGCCCCTATAATTTC
+CTCACTAAACGCCCCAGAATCTCCAGTATCATAAATAGCTCAAGCCCCCA
+CACCACTAAACTTAAACACTACCCCCACTTCCTCACTCTTCAGAACATAT
+AAAACCAACATAACCTCCATCAACAACCCTAAAAGAAATACCCCCATAAC
+AGTCGTATTAGACACCCATACCTCAGGATACTGCTCAGTAGCCATAGCCG
+TTGTATAACCAAAAACAACCAACATTCCTCCCAAATAAATCAAAAACACC
+ATCAACCCCAAAAAGGACCCTCCAAAATTCATAATAATACCACAACCTAC
+CCCTCCACTTACAATCAGCACTAAACCCCCATAAATAGGTGAAGGTTTTG
+AAGAAAACCCCACAAAACTAACAACAAAAATAACACTCAAAATAAACACA
+ATATATGTCATCATTATTCCCACGTGGAATCTAACCACGACCAATGACAT
+GAAAAATCATCGTTGTATTTCAACTATAAGAACACCAATGACAAACATCC
+GGAAATCTCACCCACTAATTAAAATCATCAATCACTCTTTTATTGACCTA
+CCAGCCCCCTCAAACATTTCATCATGATGAAACTTCGGCTCCCTCCTAGG
+AATCTGCCTAATCCTCCAAATCTTAACAGGCCTATTCCTAGCCATACACT
+ACACATCAGACACGACAACTGCCTTCTCATCCGTCACTCACATCTGCCGA
+GACGTTAACTACGGATGAATTATTCGCTACCTCCATGCCAACGGAGCATC
+AATATTTTTTATCTGCCTCTTCATTCACGTAGGACGCGGCCTCTACTACG
+GCTCTTACACATTCCTAGAGACATGAAACATTGGAATCATCCTACTTTTC
+ACAGTTATAGCTACAGCATTCATGGGCTATGTCCTACCATGAGGCCAAAT
+ATCCTTTTGAGGAGCAACAGTCATCACGAACCTCCTATCAGCAATTCCCT
+ACATCGGTACTACCCTCGTCGAGTGAATCTGAGGTGGATTCTCAGTAGAC
+AAAGCCACCCTTACCCGATTTTTTGCTTTCCACTTCATCCTACCCTTCAT
+CATCACAGCCCTGGTAGTCGTACATTTACTATTTCTTCACGAAACAGGAT
+CTAATAACCCCTCAGGAATCCCATCCGATATGGACAAAATCCCATTCCAC
+CCATATTATACAATTAAAGACATCCTAGGACTCCTCCTCCTGATCTTGCT
+CCTACTAACTCTAGTATTATTCTCCCCCGACCTCCTAGGAGACCCAGACA
+ACTACACCCCAGCTAACCCTCTCAGCACTCCCCCTCATATTAAACCAGAA
+TGGTACTTCCTGTTTGCCTACGCCATCCTACGCTCCATTCCCAACAAACT
+AGGCGGCGTATTAGCCCTAATCCTCTCCATCCTGATCCTAGCACTCATCC
+CCACCCTCCACATATCAAAACAACGAAGCATAATATTCCGGCCTCTCAGC
+CAATGCGTATTCTGACTCTTAGTGGCAGACTTACTGACACTAACATGAAT
+CGGCGGACAGCCAGTGGAACACCCATACGTAATTATCGGCCAACTGGCCT
+CAATCCTCTACTTCTCCCTAATTCTCATTTTTATACCACTCGCAAGCACC
+ATCGAAAACAATCTTCTAAAATGAAGAGTCCCTGTAGTATATCGCACATT
+ACCCTGGTCTTGTAAACCAGAAAAGGGGGAAAACGTTTCCTCCCAAGGAC
+TATCAAGGAAGAAGCTCTAGCTCCACCATCAACACCCAAAGCTGAAATTC
+TACTTAAACTATTCCTTGATTTCTTCCCCTAAACGACAACAATTTACCCT
+CATGTGCTATGTCAGTATCAGATTATACCCCCACATAACACCATACCCAC
+CTGACATGCAATATCTTATGAATGGCCTATGTACGTCGTGCATTAAATTG
+TCTGCCCCATGAATAATAAGCATGTACATAATATCATTTATCTTACATAA
+GTACATTATATTATTGATCGTGCATACCCCATCCAAGTCAAATCATTTCC
+AGTCAACACGCATATCACAGCCCATGTTCCACGAGCTTAATCACCAAGCC
+GCGGGAAATCAGCAACCCTCCCAACTACGTGTCCCAATCCTCGCTCCGGG
+CCCATCCAAACGTGGGGGTTTCTACAATGAAACTATACCTGGCATCTGGT
+TCTTTCTTCAGGGCCATTCCCACCCAACCTCGCCCATTCTTTCCCCTTAA
+ATAAGACATCTCGATGGACTAATGACTAATCAGCCCATGCTCACACATAA
+CTGTGATTTCATGCATTTGGTATCTTTTTATATTTGGGGATGCTATGACT
+CAGCTATGGCCGTCAAAGGCCTCGACGCAGTCAATTAAATTGAAGCTGGA
+CTTAAATTGAACGTTATTCCTCCGCATCAGCAACCATAAGGTGTTATTCA
+GTCCATGGTAGCGGGACATAGGAAACAAgtgcacctgtgcacctgtgcac
+ctgtgcacctgtgcacctgtgcacctgtgcacctgtgcacctgtgcacct
+gtgcacctgtgcacctgtgcacctgtgcacctgtgcacctgtgcacctgt
+gcacctgtgcacctgtgcacctgtgcacctgtgcacctgtgcacctgtgc
+acctgtgcacctgtgcacctgtgcacctgtgcacctgtgcacctgtgcac
+ctgtgcacctACCCGCGCAGTAAGCAAGTAATATAGCTTTCTTAATCAAA
+CCCCCCCTACCCCCCATTAAACTCCACATATGTACATTCAACACAATCTT
+GCCAAACCCCAAAAACAAGACTAAACAATGCACAATACTTCATGAAGCTT
+AACCCTCGCATGCCAACCATAATAACTCAACACACCTAACAATCTTAACA
+GAACTTTCCCCCCGCCATTAATACCAACATGCTACTTTAATCAATAAAAT
+TTCCATAGACAGGCATCCCCCTAGATCTAATTTTCTAAATCTGTCAACCC
+TTCTTCCCCC
diff -r d261f41a2a03 -r 36f438ce1f82 test-data/sam_to_bam_in1.sam
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/sam_to_bam_in1.sam Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,10 @@
+HWI-EAS91_1_30788AAXX:1:1:1513:715 16 chrM 9563 25 36M * 0 0 CTGACTACCACAACTAAACATCTATGCNNAAAAAAC I+-II?IDIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1698:516 16 chrM 2735 25 36M * 0 0 TTTACACTCAGAGGTTCAACTCCTCTCNNTAACAAC I9IIIII5IIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1491:637 16 chrM 10864 25 36M * 0 0 TGTAGAAGCCCCAATTGCCGGATCCATNNTGCTAGC DBAIIIIIIIIIIIFIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1711:249 16 chrM 10617 25 36M * 0 0 ACCAAACAGAACGCCTGAACGCAGGCCNNTACTTCC IIIIIIIIIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1634:211 0 chrM 9350 25 36M * 0 0 GAAGCAGNNGCTTGATACTGACACTTCGTCGACGTA IIIIIII""IIIIIIIIIIIIIIIIIIIIII9IIDF NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1218:141 16 chrM 14062 25 36M * 0 0 ACAAAACTAACAACAAAAATAACACTCNNAATAAAC I+IIII1IIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1398:854 16 chrM 3921 25 36M * 0 0 CACCCTTCCCGTACTAATAAATCCCCTNNTCTTCAC IIIII=AIIIIIIIIIIIIIIBIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1310:991 16 chrM 10002 25 36M * 0 0 CTCCTATGCCTAGAAGGAATAATACTANNACTATTC I:2IEI:IIDIIIIII4IIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1716:413 0 chrM 6040 25 36M * 0 0 GATCCAANNCTTTATCAACACCTATTCTGATTCTTC IIIIIII""IIIIIIIIIIIIIIIIIIIIIIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1630:59 16 chrM 12387 25 36M * 0 0 TCATACTCGACCCCAACCTTACCAACCNNCCGCTCC FIIHII;IIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
diff -r d261f41a2a03 -r 36f438ce1f82 test-data/sam_to_bam_in2.sam
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/sam_to_bam_in2.sam Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,10 @@
+HWI-EAS91_1_30788AAXX:1:1:1095:605 0 chrM 23 25 36M * 0 0 AAGCAAGNNACTGAAAATGCCTAGATGAGTATTCTT IIIIIII""IIIIIIIIIIIIIIIEIIIIIIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1650:1185 0 chrM 14956 25 36M * 0 0 ACCCCAGNNAACCCTCTCAGCACTCCCCCTCATATT IIIIIII""IIIIIIIIIIII6IIIIIIIII5I-II NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:799:192 16 chrM 8421 25 36M * 0 0 CCTGTAGCCCTAGCCGTGCGGCTAACCNNTAACATT II%::I<IIIIIEIII8IIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1082:719 16 chrM 7191 25 36M * 0 0 TAAATTAACCCATACCAGCACCATAGANNCTCAAGA <III0EII3+3I29I>III8AIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1746:1180 16 chrM 12013 25 36M * 0 0 CCTAAGCTTCAAACTAGATTACTTCTCNNTAATTTT IIIIIIIIFIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:606:460 0 chrM 4552 25 36M * 0 0 TTAATTTNNATTATAATAACACTCACAATATTCATA IIIIIII""IIIIIIIIIIIIIIIIII?I6IIIII6 NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1059:362 16 chrM 7348 25 36M * 0 0 GGCCACCAATGATACTGAAGCTACGAGNNTACCGAT II/<)2IIIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1483:1161 16 chrM 15080 25 36M * 0 0 TCCTGATCCTAGCACTCATCCCCACCCNNCACATAT HIIIIIFIIAIHIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1273:600 16 chrM 13855 25 36M * 0 0 GTATTAGACACCCATACCTCAGGATACNNCTCAGTA IIIIIIIIIIIIIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
+HWI-EAS91_1_30788AAXX:1:1:1190:1283 16 chrM 15338 25 36M * 0 0 TATATCGCACATTACCCTGGTCTTGTANNCCAGAAA EIII?-IIIIIAIIIIIIIIIIIIIII""IIIIIII NM:i:1 X1:i:1 MD:Z:7N0N27
diff -r d261f41a2a03 -r 36f438ce1f82 test-data/sam_to_bam_out1.bam
Binary file test-data/sam_to_bam_out1.bam has changed
diff -r d261f41a2a03 -r 36f438ce1f82 test-data/sam_to_bam_out2.bam
Binary file test-data/sam_to_bam_out2.bam has changed
diff -r d261f41a2a03 -r 36f438ce1f82 tool-data/sam_fa_indices.loc.sample
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tool-data/sam_fa_indices.loc.sample Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,27 @@
+#This is a sample file distributed with Galaxy that enables tools
+#to use a directory of Samtools indexed sequences data files. You will need
+#to create these data files and then create a sam_fa_indices.loc file
+#similar to this one (store it in this directory ) that points to
+#the directories in which those files are stored. The sam_fa_indices.loc
+#file has this format (white space characters are TAB characters):
+#
+#<index> <seq> <location>
+#
+#So, for example, if you had hg18 indexed stored in
+#/depot/data2/galaxy/sam/,
+#then the sam_fa_indices.loc entry would look like this:
+#
+#hg18 /depot/data2/galaxy/sam/hg18.fa
+#
+#and your /depot/data2/galaxy/sam/ directory
+#would contain hg18.fa and hg18.fa.fai files:
+#
+#-rw-r--r-- 1 james universe 830134 2005-09-13 10:12 hg18.fa
+#-rw-r--r-- 1 james universe 527388 2005-09-13 10:12 hg18.fa.fai
+#
+#Your sam_fa_indices.loc file should include an entry per line for
+#each index set you have stored. The file in the path does actually
+#exist, but it should never be directly used. Instead, the name serves
+#as a prefix for the index file. For example:
+#
+#hg18 /depot/data2/galaxy/sam/hg18.fa
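
The samtools wrappers below (check_seq_file in sam_to_bam.py and sam_pileup.py) read this file as tab-separated lines beginning with the literal word "index", taking the second field as the dbkey and the third as the path prefix of the FASTA/.fai pair. A small stand-alone sketch of that lookup; the file name and dbkey are examples.

    def lookup_fasta_path(loc_file, dbkey):
        # Mirrors the check_seq_file() helpers below: "index<TAB><dbkey><TAB><path>".
        for line in open(loc_file):
            line = line.rstrip("\r\n")
            if not line or line.startswith("#") or not line.startswith("index"):
                continue
            fields = line.split("\t")
            if len(fields) >= 3 and fields[1] == dbkey:
                return fields[2].strip()
        return ""

    # lookup_fasta_path("tool-data/sam_fa_indices.loc", "hg18")
    # -> "/depot/data2/galaxy/sam/hg18.fa" for a line "index\thg18\t/depot/data2/galaxy/sam/hg18.fa"
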
diff -r d261f41a2a03 -r 36f438ce1f82 tool_conf.xml.sample
--- a/tool_conf.xml.sample Fri Aug 28 15:29:53 2009 -0400
+++ b/tool_conf.xml.sample Fri Aug 28 15:59:16 2009 -0400
@@ -331,9 +331,13 @@
<tool file="metag_tools/megablast_xml_parser.xml" />
<tool file="metag_tools/blat_wrapper.xml" />
<tool file="metag_tools/mapping_to_ucsc.xml" />
- <tool file="sr_mapping/bwa_wrapper.xml" />
</section>
<section name="Tracks" id="tracks">
<tool file="visualization/genetrack.xml" />
</section>
+ <section name="SAM Tools" id="samtools">
+ <tool file="samtools/sam_to_bam.xml" />
+ <tool file="samtools/sam_merge.xml" />
+ <tool file="samtools/sam_pileup.xml" />
+ </section>
</toolbox>
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_merge.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_merge.py Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,21 @@
+#! /usr/bin/python
+
+import os, sys
+
+def stop_err( msg ):
+ sys.stderr.write( msg )
+ sys.exit()
+
+def __main__():
+    # Check the argument count before indexing into sys.argv.
+    if len( sys.argv ) < 4:
+        stop_err( 'No files to merge' )
+    infile = sys.argv[1]
+    outfile = sys.argv[2]
+    filenames = sys.argv[3:]
+ cmd1 = 'samtools merge %s %s %s' % (outfile, infile, ' '.join(filenames))
+ try:
+ os.system(cmd1)
+ except Exception, eq:
+ stop_err('Error running SAMtools merge tool\n' + str(eq))
+
+if __name__ == "__main__" : __main__()
\ No newline at end of file
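
sam_merge.py ends up running "samtools merge <output> <first> <second> ...". A hedged sketch of the same call through subprocess, which, unlike os.system(), raises if samtools exits non-zero; the function name and file names are illustrative.

    import subprocess

    def merge_bams(output_bam, first_bam, more_bams):
        # Equivalent to: samtools merge merged.bam first.bam second.bam ...
        cmd = ["samtools", "merge", output_bam, first_bam] + list(more_bams)
        subprocess.check_call(cmd)

    # merge_bams("merged.bam", "first.bam", ["second.bam", "third.bam"])
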
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_merge.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_merge.xml Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,32 @@
+<tool id="sam_merge" name="Merge BAM Files" version="1.0.0">
+ <description>merges BAM files together</description>
+ <command interpreter="python">
+ sam_merge.py
+ $input1
+ $output1
+ $input2
+ #for $i in $inputs
+ ${i.input}
+ #end for
+ </command>
+ <inputs>
+ <param name="input1" label="First file" type="data" format="bam" />
+ <param name="input2" label="with file" type="data" format="bam" help="Need to add more files? Use controls below." />
+ <repeat name="inputs" title="Input Files">
+ <param name="input" label="Add file" type="data" format="bam" />
+ </repeat>
+ </inputs>
+ <outputs>
+ <data name="output1" format="bam" />
+ </outputs>
+ <!-- bam files are binary and not sniffable so can't be uploaded without being corrupted, so no tests -->
+ <help>
+
+**What it does**
+
+This tool uses SAMTools_' merge command to merge any number of BAM files together into one BAM file.
+
+.. _SAMTools: http://samtools.sourceforge.net/samtools.shtml
+
+ </help>
+</tool>
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_merge_code.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_merge_code.py Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,35 @@
+import sets
+from galaxy.tools.parameters import DataToolParameter
+
+def validate_input( trans, error_map, param_values, page_param_map ):
+ dbkeys = sets.Set()
+ data_param_names = sets.Set()
+ data_params = 0
+ for name, param in page_param_map.iteritems():
+ if isinstance( param, DataToolParameter ):
+ # for each dataset parameter
+ if param_values.get(name, None) != None:
+ dbkeys.add( param_values[name].dbkey )
+ data_params += 1
+ # check meta data
+# try:
+# param = param_values[name]
+# startCol = int( param.metadata.startCol )
+# endCol = int( param.metadata.endCol )
+# chromCol = int( param.metadata.chromCol )
+# if param.metadata.strandCol is not None:
+# strandCol = int ( param.metadata.strandCol )
+# else:
+# strandCol = 0
+# except:
+# error_msg = "The attributes of this dataset are not properly set. " + \
+# "Click the pencil icon in the history item to set the chrom, start, end and strand columns."
+# error_map[name] = error_msg
+ data_param_names.add( name )
+ if len( dbkeys ) > 1:
+ for name in data_param_names:
+ error_map[name] = "All datasets must belong to same genomic build, " \
+ "this dataset is linked to build '%s'" % param_values[name].dbkey
+ if data_params != len(data_param_names):
+ for name in data_param_names:
+ error_map[name] = "A dataset of the appropriate type is required"
\ No newline at end of file
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_pileup.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_pileup.py Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,89 @@
+#! /usr/bin/python
+
+"""
+Creates a pileup file from a bam file and a reference.
+
+usage: %prog [options]
+ -i, --input1=i: bam file
+ -o, --output1=o: Output pileup
+ -r, --ref=r: Reference file type
+ -n, --ownFile=n: User-supplied fasta reference file
+ -d, --dbkey=d: dbkey of user-supplied file
+ -x, --indexDir=x: Index directory
+ -b, --bamIndex=b: BAM index file
+
+usage: %prog input1 output1 ref_type refFile ownFile dbkey index_dir bam_index
+"""
+
+import os, sys, tempfile
+from galaxy import eggs
+import pkg_resources; pkg_resources.require( "bx-python" )
+from bx.cookbook import doc_optparse
+
+def stop_err( msg ):
+ sys.stderr.write( msg )
+ sys.exit()
+
+def check_seq_file( dbkey, GALAXY_DATA_INDEX_DIR ):
+ seq_file = "%s/sam_fa_indices.loc" % GALAXY_DATA_INDEX_DIR
+ seq_path = ''
+ for line in open( seq_file ):
+ line = line.rstrip( '\r\n' )
+ if line and not line.startswith( "#" ) and line.startswith( 'index' ):
+ fields = line.split( '\t' )
+ if len( fields ) < 3:
+ continue
+ if fields[1] == dbkey:
+ seq_path = fields[2].strip()
+ break
+ return seq_path
+
+def __main__():
+ #Parse Command Line
+ options, args = doc_optparse.parse( __doc__ )
+ seq_path = check_seq_file( options.dbkey, options.indexDir )
+ tmp_dir = tempfile.gettempdir()
+ os.chdir(tmp_dir)
+ tmpf0 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+ tmpf0bam = '%s.bam' % tmpf0.name
+ tmpf0bambai = '%s.bam.bai' % tmpf0.name
+ tmpf1 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+ tmpf1fai = '%s.fai' % tmpf1.name
+ cmd1 = None
+ cmd2 = 'cp %s %s; cp %s %s' % (options.input1, tmpf0bam, options.bamIndex, tmpf0bambai)
+ cmd3 = 'samtools pileup -f %s %s > %s 2> /dev/null'
+ if options.ref =='indexed':
+ full_path = "%s.fai" % seq_path
+ if not os.path.exists( full_path ):
+ stop_err( "No sequences are available for '%s', request them by reporting this error." % options.dbkey )
+ cmd3 = cmd3 % (seq_path, tmpf0bam, options.output1)
+ elif options.ref == 'history':
+ cmd1 = 'cp %s %s; cp %s.fai %s' % (options.ownFile, tmpf1.name, options.ownFile, tmpf1fai)
+ cmd3 = cmd3 % (tmpf1.name, tmpf0bam, options.output1)
+ # index reference if necessary
+ if cmd1:
+ try:
+ os.system(cmd1)
+ except Exception, eq:
+ stop_err('Error moving reference sequence\n' + str(eq))
+ # copy bam index to working directory
+ try:
+ os.system(cmd2)
+ except Exception, eq:
+ stop_err('Error moving files to temp directory\n' + str(eq))
+ # perform pileup command
+ try:
+ os.system(cmd3)
+ except Exception, eq:
+ stop_err('Error running SAMtools merge tool\n' + str(eq))
+ # clean up temp files
+ tmpf1.close()
+ tmpf0.close()
+ if os.path.exists(tmpf0bam):
+ os.remove(tmpf0bam)
+ if os.path.exists(tmpf0bambai):
+ os.remove(tmpf0bambai)
+    if os.path.exists(tmpf1fai):
+        # Remove the copied reference index, not the BAM index removed above.
+        os.remove(tmpf1fai)
+
+if __name__ == "__main__" : __main__()
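
For an indexed reference, sam_pileup.py boils down to "samtools pileup -f <ref.fa> <aln.bam> > <out.pileup>" once the BAM and its index have been staged in the temp directory. A self-contained sketch of that core call; the function name and paths are illustrative, and "samtools pileup" is the pre-mpileup command this wrapper targets.

    import subprocess

    def run_pileup(reference_fasta, bam_path, output_path):
        # Equivalent to: samtools pileup -f chrM.fa aln.bam > aln.pileup
        with open(output_path, "w") as out:
            subprocess.check_call(
                ["samtools", "pileup", "-f", reference_fasta, bam_path],
                stdout=out)

    # run_pileup("chrM.fa", "aln.bam", "aln.pileup")
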
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_pileup.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_pileup.xml Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,50 @@
+<tool id="sam_pileup" name="SAM Pileup Format" version="1.0.0">
+ <description>generates the pileup format for a provided BAM file</description>
+ <command interpreter="python">
+ sam_pileup.py
+ --input1=$input1
+ --output=$output1
+ --ref=$refOrHistory.reference
+ #if $refOrHistory.reference == "history":
+ --ownFile=$refOrHistory.ownFile
+ #else:
+ --ownFile="None"
+ #end if
+ --dbkey=${input1.metadata.dbkey}
+ --indexDir=${GALAXY_DATA_INDEX_DIR}
+ --bamIndex=${input1.metadata.bam_index}
+ </command>
+  <inputs>
+    <conditional name="refOrHistory">
+      <param name="reference" type="select" label="Will you select a reference genome from your history or use a built-in index?">
+        <option value="indexed">Use a built-in index</option>
+        <option value="history">Use one from the history</option>
+      </param>
+      <when value="indexed">
+        <param name="input1" type="data" format="bam" label="Select the BAM file to generate the pileup file for">
+          <validator type="unspecified_build" />
+          <validator type="dataset_metadata_in_file" filename="sam_fa_indices.loc" metadata_name="dbkey" metadata_column="1" message="Sequences are not currently available for the specified build." line_startswith="index" />
+        </param>
+      </when>
+      <when value="history">
+        <param name="input1" type="data" format="sam,bam" label="Select the BAM file to generate the pileup file for" />
+        <param name="ownFile" type="data" format="fasta" metadata_name="dbkey" label="Select a reference genome" />
+      </when>
+    </conditional>
+  </inputs>
+  <outputs>
+    <data format="tabular" name="output1" />
+  </outputs>
+  <!-- tests are not possible because bam is a non-sniffable binary format and cannot be uploaded without being corrupted -->
+  <help>
+
+**What it does**
+
+Uses the SAMTools_ pileup command to produce a pileup-format file from the provided BAM file.
+
+.. _SAMTools: http://samtools.sourceforge.net/samtools.shtml
+
+  </help>
+</tool>
+
+
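For reference, when the "history" branch of the conditional is selected, the Cheetah command template above expands to an invocation along these lines (dataset paths, dbkey, and index file name are illustrative only):

    sam_pileup.py --input1=/galaxy/files/dataset_1.dat --output1=/galaxy/files/dataset_2.dat --ref=history --ownFile=/galaxy/files/dataset_3.dat --dbkey=chrM --indexDir=/galaxy/tool-data --bamIndex=/galaxy/files/dataset_1.dat.bai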
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_to_bam.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_to_bam.py Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,85 @@
+#! /usr/bin/python
+
+"""
+Converts SAM data to BAM format.
+
+usage: %prog [options]
+ -i, --input1=i: SAM file to be converted
+ -d, --dbkey=d: dbkey value
+ -r, --ref_file=r: Reference file if choosing from history
+ -o, --output1=o: BAM output
+ -x, --index_dir=x: Index directory
+
+usage: %prog input_file dbkey ref_list output_file
+"""
+
+import os, sys, tempfile
+from galaxy import eggs
+import pkg_resources; pkg_resources.require( "bx-python" )
+from bx.cookbook import doc_optparse
+
+def stop_err( msg ):
+    sys.stderr.write( "%s\n" % msg )
+    sys.exit()
+
+def check_seq_file( dbkey, GALAXY_DATA_INDEX_DIR ):
+    seq_file = "%s/sam_fa_indices.loc" % GALAXY_DATA_INDEX_DIR
+    seq_path = ''
+    for line in open( seq_file ):
+        line = line.rstrip( '\r\n' )
+        if line and not line.startswith( "#" ) and line.startswith( 'index' ):
+            fields = line.split( '\t' )
+            if len( fields ) < 3:
+                continue
+            if fields[1] == dbkey:
+                seq_path = fields[2].strip()
+                break
+    return seq_path
+
+def __main__():
+    #Parse Command Line
+    options, args = doc_optparse.parse( __doc__ )
+    seq_path = check_seq_file( options.dbkey, options.index_dir )
+    tmp_dir = tempfile.gettempdir()
+    os.chdir(tmp_dir)
+    tmpf1 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+    tmpf1fai = '%s.fai' % tmpf1.name
+    tmpf2 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+    tmpf3 = tempfile.NamedTemporaryFile(dir=tmp_dir)
+    tmpf3bam = '%s.bam' % tmpf3.name
+    if options.ref_file == "None":
+        full_path = "%s.fai" % seq_path
+        if not os.path.exists( full_path ):
+            stop_err( "No sequences are available for '%s', request them by reporting this error." % options.dbkey )
+        cmd1 = "cp %s %s; cp %s %s" % (seq_path, tmpf1.name, full_path, tmpf1fai)
+    else:
+        cmd1 = "cp %s %s; samtools faidx %s 2>/dev/null" % (options.ref_file, tmpf1.name, tmpf1.name)
+    cmd2 = "samtools view -bt %s -o %s %s 2>/dev/null" % (tmpf1fai, tmpf2.name, options.input1)
+    cmd3 = "samtools sort %s %s 2>/dev/null" % (tmpf2.name, tmpf3.name)
+    cmd4 = "cp %s %s" % (tmpf3bam, options.output1)
+    # either create index based on fa file or copy provided index to temp directory
+    try:
+        os.system(cmd1)
+    except Exception, eq:
+        stop_err("Error creating the reference list index.\n" + str(eq))
+    # create original bam file
+    try:
+        os.system(cmd2)
+    except Exception, eq:
+        stop_err("Error running view command.\n" + str(eq))
+    # sort original bam file to produce sorted output bam file
+    try:
+        os.system(cmd3)
+        os.system(cmd4)
+    except Exception, eq:
+        stop_err("Error sorting data and creating output file.\n" + str(eq))
+    # cleanup temp files
+    tmpf1.close()
+    tmpf2.close()
+    tmpf3.close()
+    if os.path.exists(tmpf1fai):
+        os.remove(tmpf1fai)
+    if os.path.exists(tmpf3bam):
+        os.remove(tmpf3bam)
+
+if __name__=="__main__": __main__()
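For context, cmd1 through cmd4 above wrap a three-step samtools pipeline plus a copy of the result to the output dataset. A minimal standalone sketch of that pipeline, with hypothetical file names and assuming a 2009-era samtools 0.1.x on the PATH (where "sort" takes an output prefix):

    import os
    # index the reference fasta; writes ref.fa.fai (the history-reference case above)
    os.system( 'samtools faidx ref.fa' )
    # convert SAM to BAM, using the .fai as the reference name/length list
    os.system( 'samtools view -bt ref.fa.fai -o unsorted.bam input.sam' )
    # sort by coordinate; the second argument is a prefix, so this writes sorted.bam
    os.system( 'samtools sort unsorted.bam sorted' )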
diff -r d261f41a2a03 -r 36f438ce1f82 tools/samtools/sam_to_bam.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/samtools/sam_to_bam.xml Fri Aug 28 15:59:16 2009 -0400
@@ -0,0 +1,59 @@
+<tool id="sam_to_bam" name="SAM-to-BAM" version="1.0.0">
+ <description>converts SAM format to BAM format</description>
+ <command interpreter="python">
+ sam_to_bam.py
+ --input1=$source.input1
+ --dbkey=${input1.metadata.dbkey}
+ #if $source.indexSource == "history":
+ --ref_file=$ref_file
+ #else
+ --ref_file="None"
+ #end if
+ --output1=$output1
+ --index_dir=${GALAXY_DATA_INDEX_DIR}
+ </command>
+  <inputs>
+    <conditional name="source">
+      <param name="indexSource" type="select" label="Choose the source for the reference list">
+        <option value="built_in">Built-in</option>
+        <option value="history">History</option>
+      </param>
+      <when value="built_in">
+        <param name="input1" type="data" format="sam" label="SAM File to Convert">
+          <validator type="unspecified_build" />
+          <validator type="dataset_metadata_in_file" filename="sam_fa_indices.loc" metadata_name="dbkey" metadata_column="1" message="Sequences are not currently available for the specified build." line_startswith="index" />
+        </param>
+      </when>
+      <when value="history">
+        <param name="input1" type="data" format="sam" label="SAM File to Convert" />
+        <param name="ref_file" type="data" format="fasta" label="Choose the reference file" />
+      </when>
+    </conditional>
+  </inputs>
+  <outputs>
+    <data name="output1" format="bam"/>
+  </outputs>
+  <tests>
+    <test>
+      <param name="indexSource" value="history" />
+      <param name="input1" value="sam_to_bam_in1.sam" ftype="sam" />
+      <param name="ref_file" value="chrM.fa" ftype="fasta" />
+      <output name="output1" file="sam_to_bam_out1.bam" />
+    </test>
+    <test>
+      <param name="indexSource" value="built_in" />
+      <param name="input1" value="sam_to_bam_in2.sam" ftype="sam" dbkey="chrM" />
+      <param name="ref_file" value="chrM.fa" ftype="fasta" />
+      <output name="output1" file="sam_to_bam_out2.bam" />
+    </test>
+  </tests>
+  <help>
+
+**What it does**
+
+This tool uses the SAMTools_ toolkit to convert the input SAM file into a sorted BAM file.
+
+.. _SAMTools: http://samtools.sourceforge.net/samtools.shtml
+
+  </help>
+</tool>
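For reference, the wrapper above ultimately invokes the conversion script with arguments along these lines (all paths here are illustrative):

    python sam_to_bam.py --input1=input.sam --dbkey=chrM --ref_file=chrM.fa --output1=output.bam --index_dir=/galaxy/tool-data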