We recently installed NCBI BLAST+ on our local Galaxy instance, and we now need the ability to filter/mask ... by taxon ID (taxid) using BLAST's command-line option -window_masker_taxid (cf. http://www.ncbi.nlm.nih.gov/books/NBK1763/) before the job is performed.
However, searching the documentation and the mailing list, I did not find anything about this.
So it would be great to add the possibility to subselect by taxid (as the NCBI BLAST server does, http://blast.ncbi.nlm.nih.gov/Blast.cgi) by changing the BLAST integration.
We believe this could be of common interest. It would be great if anybody could comment on this.
Thanks a lot, Thomas
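For reference, a minimal sketch of the command-line invocation being requested above. The query, database, and taxid values are placeholders, and this assumes the WindowMasker data files for the taxon in question are available to BLAST:

```shell
# Hypothetical example: blastn with WindowMasker-based masking
# for human sequences (taxid 9606). File and database names are
# placeholders, not values from the original message.
blastn -query query.fasta -db nt \
       -window_masker_taxid 9606 \
       -out results.txt
```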
While examining tool output we came across the following method. Given that the current default is to disallow such activities, we thought it might be useful to bring it to your attention.
The attached file provides an example which, when uploaded to a history and viewed, produces a popup on the current stable release of Galaxy (local install and https://usegalaxy.org).
We are trying to enable job splitting and merging for our Galaxy BLAST tools. We set the following parameters in universe_wsgi.ini:
use_tasked_jobs = True
local_task_queue_workers = 2
Then we executed the "NCBI BLAST+ blastn" tool (from the devteam's NCBI BLAST+ repository) on a query set of 10,000 sequences. The tool's XML has the following parallelism tag:
As expected, the job was split into 10 tasks (IDs 0-9) of 1,000 sequences each, and four tasks started running simultaneously.
Those four tasks have completed, but there is no indication that the other six tasks have begun running, and the job is still shown in the yellow (running) state in the History panel.
It has been nearly an hour with no sign of those tasks or any job running. What could be the cause of this?
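For reference, a parallelism tag of the kind mentioned above looks roughly like the following. The attribute values here are an assumption chosen to match the 1,000-sequence split described in this message, not a copy of the actual wrapper:

```xml
<parallelism method="multi" split_inputs="query"
             split_mode="to_size" split_size="1000"
             merge_outputs="output1"></parallelism>
```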
ICS – Sr. Bioinformatics Engineer
J. Craig Venter Institute
Has anyone set up a local toolshed with external authentication?
Is this expected to work?
I have external auth working, but tools cannot be installed (403 Forbidden) unless I turn off authentication.
If I turn on remote auth, I have to configure the webserver to ask for credentials, otherwise I get an error page.
It would make sense to have the webserver request credentials only for requests to a login page, but I don't see how to do that.
For now I've just turned off remote auth.
Brad Langhorst, Ph.D.
Applications and Product Development Scientist
Something strange happens with Cufflinks on our Galaxy server. When a
user deletes a running Cufflinks job, the associated Cufflinks
process(es) are in fact not terminated. Apart from unnecessary CPU usage, this
prevents other jobs from starting if the user has already reached the
max jobs limit.
This happens only with Cufflinks; Cuffcompare, for comparison, behaves as expected.
Federico Zambelli, Ph.D.
Bioinformatics, Evolution and Comparative Genomics Lab
Dept. of Biosciences
University of Milano - Italy
What can be asserted without proof can be dismissed without proof.
I have Galaxy running as the galaxy user on a virtual machine. I've enabled LDAP authentication, and the Galaxy software is running on our HPC file system.
I've created a scratch folder on our HPC file system for users to upload data from a filesystem path. I am able to upload a data file, but when I attempt to import the data into my current history (which I think I need to do in order to run a job using this data), I get a message that reads:
'You do not have permission to add datasets to 1 requested histories.'
I've tried looking at paster.log, to no avail. Is there a better log to check? Any help would be greatly appreciated.
FSU Research Computing Center
What's the right way to contribute additions to other people's repositories?
I have extended the Picard wrappers to support the RNA-seq metrics and downsample SAM tools, but I don't want to start a new repository just for these.
I have attached the output of hg outgoing -p in case that's useful.
I am trying to import a large MAF file (gzipped: 5 GB; unzipped: ~60 GB). When I import the file, I get the following message upon completion:
"An error occurred setting the metadata for this dataset. You may be able to set it manually or retry auto-detection"
However, when I open the settings and retry auto-detection, I continue to get the error. Is this an issue with the file size? (I can upload smaller MAFs of 20 GB without a problem.) I am certain the file is not corrupt, since it was downloaded via FTP from UCSC (I also tried importing directly from UCSC into Galaxy, but got the same error).