galaxy-commits
October 2013
- 1 participant
- 226 discussions
commit/galaxy-central: guerler: Fix textarea background color in new upload front end
by commits-noreply@bitbucket.org 16 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/940a310179c4/
Changeset: 940a310179c4
User: guerler
Date: 2013-10-16 11:16:55
Summary: Fix textarea background color in new upload front end
Affected #: 2 files
diff -r f7760ffba1883eb14617890066778f38021f2d50 -r 940a310179c4ad9cfe7dabae2ac71402608a636a static/style/blue/base.css
--- a/static/style/blue/base.css
+++ b/static/style/blue/base.css
@@ -1129,7 +1129,7 @@
.upload-box .table th{text-align:center;white-space:nowrap}
.upload-box .table td{margin:0px;paddign:0px}
.upload-box .title{width:130px;word-wrap:break-word;font-size:11px}
-.upload-box .text{position:absolute;display:none}.upload-box .text .text-content{font-size:11px;width:100%;height:50px;resize:none}
+.upload-box .text{position:absolute;display:none}.upload-box .text .text-content{font-size:11px;width:100%;height:50px;resize:none;background:inherit}
.upload-box .text .text-info{font-size:11px;color:#999}
.upload-box .extension{width:100px;font-size:11px}
.upload-box .genome{width:150px;font-size:11px}
diff -r f7760ffba1883eb14617890066778f38021f2d50 -r 940a310179c4ad9cfe7dabae2ac71402608a636a static/style/src/less/upload.less
--- a/static/style/src/less/upload.less
+++ b/static/style/src/less/upload.less
@@ -48,6 +48,7 @@
width : 100%;
height : 50px;
resize : none;
+ background : inherit;
}
.text-info {
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: guerler: Fix for Firefox
by commits-noreply@bitbucket.org 16 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f7760ffba188/
Changeset: f7760ffba188
User: guerler
Date: 2013-10-16 10:53:59
Summary: Fix for Firefox
Affected #: 1 file
diff -r 41d3434f8506bb13380bd072e64194519e32a36e -r f7760ffba1883eb14617890066778f38021f2d50 static/scripts/galaxy.upload.js
--- a/static/scripts/galaxy.upload.js
+++ b/static/scripts/galaxy.upload.js
@@ -164,15 +164,17 @@
// get text component
var text = it.find('#text');
+ // get padding
+ var padding = 8;
+
// get dimensions
- var padding = parseInt($(text.parent()).css('padding'));
var width = it.width() - 2 * padding;
- var height = it.height();
-
+ var height = it.height() - padding;
+
// set dimensions
- text.width(width);
+ text.css('width', width + 'px');
text.css('top', height + 'px');
- it.height(height + text.height() + padding);
+ it.height(height + text.height() + 2 * padding);
// show text field
text.show();
@@ -471,7 +473,7 @@
'Close' : function() {self.modal.hide()},
},
height : '400',
- width : '850'
+ width : '900'
});
// set element
@@ -611,11 +613,13 @@
{
// construct template
var tmpl = '<tr id="' + id.substr(1) + '" class="upload-item">' +
- '<td style="position: relative;">' +
- '<div id="title" class="title"></div>' +
- '<div id="text" class="text">' +
- '<div class="text-info">You may specify a list of URLs (one per line) or paste the contents of a file.</div>' +
- '<textarea id="text-content" class="text-content form-control"></textarea>' +
+ '<td>' +
+ '<div style="position: relative;">' +
+ '<div id="title" class="title"></div>' +
+ '<div id="text" class="text">' +
+ '<div class="text-info">You may specify a list of URLs (one per line) or paste the contents of a file.</div>' +
+ '<textarea id="text-content" class="text-content form-control"></textarea>' +
+ '</div>' +
'</div>' +
'</td>' +
'<td><div id="size" class="size"></div></td>';
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: guerler: Show file size for manually created files
by commits-noreply@bitbucket.org 16 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/41d3434f8506/
Changeset: 41d3434f8506
User: guerler
Date: 2013-10-16 10:05:53
Summary: Show file size for manually created files
Affected #: 1 file
diff -r f170cf78702b3913a5e09ada1c75db2800ebc982 -r 41d3434f8506bb13380bd072e64194519e32a36e static/scripts/galaxy.upload.js
--- a/static/scripts/galaxy.upload.js
+++ b/static/scripts/galaxy.upload.js
@@ -144,6 +144,10 @@
// add functionality to remove button
var self = this;
it.find('#symbol').on('click', function() { self.event_remove (index) });
+ it.find('#text-content').on('keyup', function() {
+ var count = it.find('#text-content').val().length;
+ it.find('#size').html(self.size_to_string (count));
+ });
// initialize progress
this.event_progress(index, file, 0);
@@ -437,8 +441,8 @@
}
},
- // add (pseudo) file
- event_add : function ()
+ // create (pseudo) file
+ event_create : function ()
{
this.uploadbox.add([{ name : 'New File', size : -1 }]);
},
@@ -460,7 +464,7 @@
body : this.template('upload-box', 'upload-info'),
buttons : {
'Select' : function() {self.uploadbox.select()},
- 'Create' : function() {self.event_add()},
+ 'Create' : function() {self.event_create()},
'Upload' : function() {self.event_start()},
'Pause' : function() {self.event_stop()},
'Reset' : function() {self.event_reset()},
@@ -511,7 +515,8 @@
if (size >= 100000000) { size = size / 100000000; unit = 'GB'; } else
if (size >= 100000) { size = size / 100000; unit = 'MB'; } else
if (size >= 100) { size = size / 100; unit = 'KB'; } else
- if (size > 0) { size = size * 10; unit = 'b'; } else return '?';
+ if (size > 0) { size = size * 10; unit = 'b'; } else
+ return '<strong>-</strong>';
// return formatted string
return '<strong>' + (Math.round(size) / 10) + '</strong> ' + unit;
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: guerler: Add url, text input field to upload form
by commits-noreply@bitbucket.org 16 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/f170cf78702b/
Changeset: f170cf78702b
User: guerler
Date: 2013-10-16 09:36:17
Summary: Add url, text input field to upload form
Affected #: 4 files
diff -r ecd1fe4c471d2cf9ede923c43f27d4e380f961d1 -r f170cf78702b3913a5e09ada1c75db2800ebc982 static/scripts/galaxy.upload.js
--- a/static/scripts/galaxy.upload.js
+++ b/static/scripts/galaxy.upload.js
@@ -71,7 +71,7 @@
on_click : function(e) { self.event_show(e) },
on_unload : function() {
if (self.counter.running > 0)
- return "Currently uploads are running.";
+ return "Several uploads are still processing.";
},
with_number : true
});
@@ -129,9 +129,6 @@
// add upload item
$(this.el).find('tbody:last').append(this.template_row(id));
- // scroll to bottom
- //$(this.el).scrollTop($(this.el).prop('scrollHeight'));
-
// access upload item
var it = this.get_upload_item(index);
@@ -156,6 +153,26 @@
// update screen
this.update_screen();
+
+ // activate text field if file content is zero
+ if (file.size == -1)
+ {
+ // get text component
+ var text = it.find('#text');
+
+ // get dimensions
+ var padding = parseInt($(text.parent()).css('padding'));
+ var width = it.width() - 2 * padding;
+ var height = it.height();
+
+ // set dimensions
+ text.width(width);
+ text.css('top', height + 'px');
+ it.height(height + text.height() + padding);
+
+ // show text field
+ text.show();
+ }
},
// start
@@ -175,8 +192,13 @@
var current_history = Galaxy.currHistoryPanel.model.get('id');
var file_type = it.find('#extension').val();
var genome = it.find('#genome').val();
+ var url_paste = it.find('#text-content').val();
var space_to_tabs = it.find('#space_to_tabs').is(':checked');
+ // validate
+ if (!url_paste && !(file.size > 0))
+ return null;
+
// configure uploadbox
this.uploadbox.configure({url : galaxy_config.root + "api/tools/", paramname : "files_0|file_data"});
@@ -186,6 +208,7 @@
tool_input['file_type'] = file_type;
tool_input['files_0|NAME'] = file.name;
tool_input['files_0|type'] = 'upload_dataset';
+ tool_input['files_0|url_paste'] = url_paste;
tool_input['space_to_tabs'] = space_to_tabs;
// setup data
@@ -193,7 +216,7 @@
data['history_id'] = current_history;
data['tool_id'] = 'upload1';
data['inputs'] = JSON.stringify(tool_input);
-
+
// return additional data to be send with file
return data;
},
@@ -288,7 +311,7 @@
},
// start upload process
- event_upload : function()
+ event_start : function()
{
// check
if (this.counter.announce == 0 || this.counter.running > 0)
@@ -307,6 +330,7 @@
symbol.addClass(self.state.queued);
// disable options
+ $(this).find('#text-content').attr('disabled', true);
$(this).find('#genome').attr('disabled', true);
$(this).find('#extension').attr('disabled', true);
$(this).find('#space_to_tabs').attr('disabled', true);
@@ -318,21 +342,21 @@
this.update_screen();
// initiate upload procedure in plugin
- this.uploadbox.upload();
+ this.uploadbox.start();
},
// pause upload process
- event_pause : function()
+ event_stop : function()
{
// check
if (this.counter.running == 0)
return;
// request pause
- this.uploadbox.pause();
+ this.uploadbox.stop();
// set html content
- $('#upload-info').html('Queueing will pause after completing the current file...');
+ $('#upload-info').html('Queue will pause after completing the current file...');
},
// queue is done
@@ -355,6 +379,7 @@
symbol.addClass(self.state.init);
// disable options
+ $(this).find('#text-content').attr('disabled', false);
$(this).find('#genome').attr('disabled', false);
$(this).find('#extension').attr('disabled', false);
$(this).find('#space_to_tabs').attr('disabled', false);
@@ -412,6 +437,12 @@
}
},
+ // add (pseudo) file
+ event_add : function ()
+ {
+ this.uploadbox.add([{ name : 'New File', size : -1 }]);
+ },
+
// show/hide upload frame
event_show : function (e)
{
@@ -428,13 +459,14 @@
title : 'Upload files from your local drive',
body : this.template('upload-box', 'upload-info'),
buttons : {
- 'Select' : function() {self.uploadbox.select()},
- 'Upload' : function() {self.event_upload()},
- 'Pause' : function() {self.event_pause()},
- 'Reset' : function() {self.event_reset()},
- 'Close' : function() {self.modal.hide()}
+ 'Select' : function() {self.uploadbox.select()},
+ 'Create' : function() {self.event_add()},
+ 'Upload' : function() {self.event_start()},
+ 'Pause' : function() {self.event_stop()},
+ 'Reset' : function() {self.event_reset()},
+ 'Close' : function() {self.modal.hide()},
},
- height : '350',
+ height : '400',
width : '850'
});
@@ -475,13 +507,14 @@
{
// identify unit
var unit = "";
- if (size >= 100000000000) { size = size / 100000000000; unit = "TB"; } else
- if (size >= 100000000) { size = size / 100000000; unit = "GB"; } else
- if (size >= 100000) { size = size / 100000; unit = "MB"; } else
- if (size >= 100) { size = size / 100; unit = "KB"; } else
- { size = size * 10; unit = "b"; }
+ if (size >= 100000000000) { size = size / 100000000000; unit = 'TB'; } else
+ if (size >= 100000000) { size = size / 100000000; unit = 'GB'; } else
+ if (size >= 100000) { size = size / 100000; unit = 'MB'; } else
+ if (size >= 100) { size = size / 100; unit = 'KB'; } else
+ if (size > 0) { size = size * 10; unit = 'b'; } else return '?';
+
// return formatted string
- return "<strong>" + (Math.round(size) / 10) + "</strong> " + unit;
+ return '<strong>' + (Math.round(size) / 10) + '</strong> ' + unit;
},
// set screen
@@ -494,7 +527,7 @@
// check default message
if(this.counter.announce == 0)
{
- if (this.uploadbox.compatible)
+ if (this.uploadbox.compatible())
message = 'Drag&drop files into this box or click \'Select\' to select files!';
else
message = 'Unfortunately, your browser does not support multiple file uploads or drag&drop.<br>Please upgrade to i.e. Firefox 4+, Chrome 7+, IE 10+, Opera 12+ or Safari 6+.'
@@ -532,10 +565,14 @@
// select upload button
if (this.counter.running == 0)
+ {
this.modal.enableButton('Select');
- else
+ this.modal.enableButton('Create');
+ } else {
this.modal.disableButton('Select');
-
+ this.modal.disableButton('Create');
+ }
+
// table visibility
if (this.counter.announce + this.counter.success + this.counter.error > 0)
$(this.el).find('table').show();
@@ -569,7 +606,13 @@
{
// construct template
var tmpl = '<tr id="' + id.substr(1) + '" class="upload-item">' +
- '<td><div id="title" class="title"></div></td>' +
+ '<td style="position: relative;">' +
+ '<div id="title" class="title"></div>' +
+ '<div id="text" class="text">' +
+ '<div class="text-info">You may specify a list of URLs (one per line) or paste the contents of a file.</div>' +
+ '<textarea id="text-content" class="text-content form-control"></textarea>' +
+ '</div>' +
+ '</td>' +
'<td><div id="size" class="size"></div></td>';
// add file type selectore
diff -r ecd1fe4c471d2cf9ede923c43f27d4e380f961d1 -r f170cf78702b3913a5e09ada1c75db2800ebc982 static/scripts/utils/galaxy.uploadbox.js
--- a/static/scripts/utils/galaxy.uploadbox.js
+++ b/static/scripts/utils/galaxy.uploadbox.js
@@ -25,7 +25,8 @@
error_default : "Please make sure the file is available.",
error_server : "Upload request failed.",
error_toomany : "You can only queue <20 files per upload session.",
- error_login : "Uploads require you to log in."
+ error_login : "Uploads require you to log in.",
+ error_missing : "No upload content available."
}
// options
@@ -42,7 +43,7 @@
// indicates if queue is currently running
var queue_running = false;
- var queue_pause = false;
+ var queue_stop = false;
// element
var el = null;
@@ -159,21 +160,15 @@
// process an upload, recursive
function process()
{
- // log
- //console.log("Processing queue..." + queue_length + " (" + queue_running + " / " + queue_pause + ")");
-
// validate
- if (queue_length == 0 || queue_pause)
+ if (queue_length == 0 || queue_stop)
{
- queue_pause = false;
+ queue_stop = false;
queue_running = false;
opts.complete();
return;
} else
queue_running = true;
-
- // log
- //console.log("Looking for file...");
// get an identifier from the queue
var index = -1;
@@ -188,9 +183,6 @@
// remove from queue
remove(index)
-
- // log
- //console.log("Initializing ('" + file.name + "').");
// identify maximum file size
var filesize = file.size;
@@ -199,8 +191,14 @@
// check file size
if (filesize < maxfilesize)
{
- // send data
- send(index, file, opts.initialize(index, file))
+ // get parameters
+ var data = opts.initialize(index, file);
+
+ // validate
+ if (data)
+ send(index, file, data);
+ else
+ error(index, file, opts.error_missing);
} else {
// skip file
error(index, file, opts.error_filesize);
@@ -214,7 +212,10 @@
var formData = new FormData();
for (var key in data)
formData.append(key, data[key]);
- formData.append(opts.paramname, file, file.name);
+
+ // check file size
+ if (file.size > 0)
+ formData.append(opts.paramname, file, file.name);
// prepare request
xhr = new XMLHttpRequest();
@@ -226,9 +227,6 @@
// captures state changes
xhr.onreadystatechange = function()
{
- // status change
- //console.log("Status changed: " + xhr.readyState + ".");
-
// check for request completed, server connection closed
if (xhr.readyState != xhr.DONE)
return;
@@ -271,9 +269,6 @@
// send request
xhr.send(formData);
-
- // sending file
- //console.log("Sending file ('" + file.name + "').");
}
// success
@@ -314,7 +309,7 @@
}
// initiate upload process
- function upload()
+ function start()
{
if (!queue_running)
{
@@ -323,11 +318,11 @@
}
}
- // pause upload process
- function pause()
+ // stop upload process
+ function stop()
{
- // request pause
- queue_pause = true;
+ // request stop
+ queue_stop = true;
}
// set options
@@ -349,9 +344,10 @@
// export functions
return {
'select' : select,
+ 'add' : add,
'remove' : remove,
- 'upload' : upload,
- 'pause' : pause,
+ 'start' : start,
+ 'stop' : stop,
'reset' : reset,
'configure' : configure,
'compatible' : compatible
diff -r ecd1fe4c471d2cf9ede923c43f27d4e380f961d1 -r f170cf78702b3913a5e09ada1c75db2800ebc982 static/style/blue/base.css
--- a/static/style/blue/base.css
+++ b/static/style/blue/base.css
@@ -1129,6 +1129,8 @@
.upload-box .table th{text-align:center;white-space:nowrap}
.upload-box .table td{margin:0px;paddign:0px}
.upload-box .title{width:130px;word-wrap:break-word;font-size:11px}
+.upload-box .text{position:absolute;display:none}.upload-box .text .text-content{font-size:11px;width:100%;height:50px;resize:none}
+.upload-box .text .text-info{font-size:11px;color:#999}
.upload-box .extension{width:100px;font-size:11px}
.upload-box .genome{width:150px;font-size:11px}
.upload-box .size{width:60px;white-space:nowrap}
diff -r ecd1fe4c471d2cf9ede923c43f27d4e380f961d1 -r f170cf78702b3913a5e09ada1c75db2800ebc982 static/style/src/less/upload.less
--- a/static/style/src/less/upload.less
+++ b/static/style/src/less/upload.less
@@ -39,6 +39,23 @@
font-size : @font-size-small;
}
+ .text {
+ position: absolute;
+ display: none;
+
+ .text-content {
+ font-size : @font-size-small;
+ width : 100%;
+ height : 50px;
+ resize : none;
+ }
+
+ .text-info {
+ font-size : @font-size-small;
+ color : @gray-light;
+ }
+ }
+
.extension {
width: 100px;
font-size : @font-size-small;
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: guerler: Sort genomes in genome selector
by commits-noreply@bitbucket.org 15 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ecd1fe4c471d/
Changeset: ecd1fe4c471d
User: guerler
Date: 2013-10-16 05:40:46
Summary: Sort genomes in genome selector
Affected #: 1 file
diff -r 443c97b1017eae9374ab91c0d87a19252db8a0e0 -r ecd1fe4c471d2cf9ede923c43f27d4e380f961d1 static/scripts/galaxy.upload.js
--- a/static/scripts/galaxy.upload.js
+++ b/static/scripts/galaxy.upload.js
@@ -3,7 +3,7 @@
*/
// dependencies
-define(["galaxy.modal", "galaxy.master", "utils/galaxy.utils", "utils/galaxy.uploadbox", "libs/backbone/backbone-relational"], function(mod_modal, mod_master, mod_util) {
+define(["galaxy.modal", "galaxy.master", "utils/galaxy.utils", "utils/galaxy.uploadbox", "libs/backbone/backbone-relational"], function(mod_modal, mod_master, mod_utils) {
// galaxy upload
var GalaxyUpload = Backbone.View.extend(
@@ -18,14 +18,10 @@
uploadbox: null,
// extension types
- select_extension : {
- 'auto' : 'Auto-detect'
- },
+ select_extension :[['Auto-detect', 'auto']],
// genomes
- select_genome : {
- '?' : 'Unspecified'
- },
+ select_genome : [['Unspecified (?)', '?']],
// states
state : {
@@ -85,17 +81,32 @@
// load extension
var self = this;
- mod_util.jsonFromUrl(galaxy_config.root + "api/datatypes",
+ mod_utils.jsonFromUrl(galaxy_config.root + "api/datatypes",
function(datatypes) {
for (key in datatypes)
- self.select_extension[datatypes[key]] = datatypes[key];
+ self.select_extension.push([datatypes[key], datatypes[key]]);
});
// load genomes
- mod_util.jsonFromUrl(galaxy_config.root + "api/genomes",
+ mod_utils.jsonFromUrl(galaxy_config.root + "api/genomes",
function(genomes) {
+ // backup default
+ var def = self.select_genome[0];
+
+ // fill array
+ self.select_genome = [];
for (key in genomes)
- self.select_genome[genomes[key][1]] = genomes[key][0];
+ if (genomes[key].length > 1)
+ if (genomes[key][1] !== def[1])
+ self.select_genome.push(genomes[key]);
+
+ // sort
+ self.select_genome.sort(function(a, b) {
+ return a[0] > b[0] ? 1 : a[0] < b[0] ? -1 : 0;
+ });
+
+ // insert default back to array
+ self.select_genome.unshift(def);
});
},
@@ -565,7 +576,7 @@
tmpl += '<td>' +
'<select id="extension" class="extension">';
for (key in this.select_extension)
- tmpl += '<option value="' + key + '">' + this.select_extension[key] + '</option>';
+ tmpl += '<option value="' + this.select_extension[key][1] + '">' + this.select_extension[key][0] + '</option>';
tmpl += '</select>' +
'</td>';
@@ -573,7 +584,7 @@
tmpl += '<td>' +
'<select id="genome" class="genome">';
for (key in this.select_genome)
- tmpl += '<option value="' + key + '">' + this.select_genome[key] + '</option>';
+ tmpl += '<option value="' + this.select_genome[key][1] + '">' + this.select_genome[key][0] + '</option>';
tmpl += '</select>' +
'</td>';
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: guerler: Add genome selector to file upload
by commits-noreply@bitbucket.org 15 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/443c97b1017e/
Changeset: 443c97b1017e
User: guerler
Date: 2013-10-16 03:04:47
Summary: Add genome selector to file upload
Affected #: 4 files
diff -r 9da85b79e0000f83e0232c06188ce89fd0e969a8 -r 443c97b1017eae9374ab91c0d87a19252db8a0e0 static/scripts/galaxy.modal.js
--- a/static/scripts/galaxy.modal.js
+++ b/static/scripts/galaxy.modal.js
@@ -15,7 +15,9 @@
optionsDefault: {
title : "galaxy-modal",
body : "",
- backdrop : true
+ backdrop : true,
+ height : null,
+ width : null
},
// options
@@ -23,7 +25,6 @@
// initialize
initialize : function(options) {
- // create
if (options)
this.create(options);
},
@@ -32,18 +33,18 @@
show: function(options) {
// create
this.initialize(options);
+
+ // fix height
+ if (this.options.height)
+ {
+ this.$body.css('height', this.options.height);
+ this.$body.css('overflow', 'hidden');
+ } else
+ this.$body.css('max-height', $(window).height() / 2);
- // fix height
- var maxHeight = $(document).height() / 2;
- this.$body.css('max-height', maxHeight);
-
- // fix height if available
- if (this.options.height)
- this.$body.css('height', Math.min(this.options.height, maxHeight));
-
- // remove scroll bar
- if (this.options.height)
- this.$body.css('overflow', 'hidden');
+ // fix width
+ if (this.options.width)
+ this.$dialog.css('width', this.options.width);
// show
if (this.visible)
@@ -81,6 +82,7 @@
this.setElement(this.template(this.options.title));
// link elements
+ this.$dialog = (this.$el).find('.modal-dialog');
this.$body = (this.$el).find('.modal-body');
this.$footer = (this.$el).find('.modal-footer');
this.$buttons = (this.$el).find('.buttons');
@@ -89,9 +91,6 @@
// append body
this.$body.html(this.options.body);
- // fix min-width so that modal cannot shrink considerably if new content is loaded.
- this.$body.css('min-width', this.$body.width());
-
// configure background
if (!this.options.backdrop)
this.$backdrop.removeClass('in');
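The modal now accepts explicit height and width options: when height is given, the body is fixed and its scrollbar suppressed; otherwise max-height falls back to half the window. A hypothetical call site, assuming modal is a GalaxyModal instance (the values mirror what the upload view's event_show() passes further down this email):

    // Hypothetical usage of the new sizing options.
    modal.show({
        title   : 'Upload files from your local drive',
        body    : '<div>...</div>',   // placeholder body
        buttons : { 'Close' : function() { modal.hide(); } },
        height  : '350',              // fixes body height, hides its scrollbar
        width   : '850'               // sets the .modal-dialog width
    });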
diff -r 9da85b79e0000f83e0232c06188ce89fd0e969a8 -r 443c97b1017eae9374ab91c0d87a19252db8a0e0 static/scripts/galaxy.upload.js
--- a/static/scripts/galaxy.upload.js
+++ b/static/scripts/galaxy.upload.js
@@ -22,6 +22,11 @@
'auto' : 'Auto-detect'
},
+ // genomes
+ select_genome : {
+ '?' : 'Unspecified'
+ },
+
// states
state : {
init : 'fa-icon-trash',
@@ -82,10 +87,15 @@
var self = this;
mod_util.jsonFromUrl(galaxy_config.root + "api/datatypes",
function(datatypes) {
- for (key in datatypes) {
- var filetype = datatypes[key];
- self.select_extension[filetype] = filetype;
- }
+ for (key in datatypes)
+ self.select_extension[datatypes[key]] = datatypes[key];
+ });
+
+ // load genomes
+ mod_util.jsonFromUrl(galaxy_config.root + "api/genomes",
+ function(genomes) {
+ for (key in genomes)
+ self.select_genome[genomes[key][1]] = genomes[key][0];
});
},
@@ -106,7 +116,7 @@
var id = '#upload-' + index;
// add upload item
- $(this.el).find('tbody:last').append(this.template_row(id, this.select_extension));
+ $(this.el).find('tbody:last').append(this.template_row(id));
// scroll to bottom
//$(this.el).scrollTop($(this.el).prop('scrollHeight'));
@@ -153,6 +163,7 @@
// get configuration
var current_history = Galaxy.currHistoryPanel.model.get('id');
var file_type = it.find('#extension').val();
+ var genome = it.find('#genome').val();
var space_to_tabs = it.find('#space_to_tabs').is(':checked');
// configure uploadbox
@@ -160,7 +171,7 @@
// configure tool
tool_input = {};
- tool_input['dbkey'] = '?';
+ tool_input['dbkey'] = genome;
tool_input['file_type'] = file_type;
tool_input['files_0|NAME'] = file.name;
tool_input['files_0|type'] = 'upload_dataset';
@@ -285,6 +296,7 @@
symbol.addClass(self.state.queued);
// disable options
+ $(this).find('#genome').attr('disabled', true);
$(this).find('#extension').attr('disabled', true);
$(this).find('#space_to_tabs').attr('disabled', true);
}
@@ -332,6 +344,7 @@
symbol.addClass(self.state.init);
// disable options
+ $(this).find('#genome').attr('disabled', false);
$(this).find('#extension').attr('disabled', false);
$(this).find('#space_to_tabs').attr('disabled', false);
}
@@ -410,7 +423,8 @@
'Reset' : function() {self.event_reset()},
'Close' : function() {self.modal.hide()}
},
- height : '350'
+ height : '350',
+ width : '850'
});
// set element
@@ -528,6 +542,7 @@
'<th>Name</th>' +
'<th>Size</th>' +
'<th>Type</th>' +
+ '<th>Genome</th>' +
'<th>Space→Tab</th>' +
'<th>Status</th>' +
'<th></th>' +
@@ -539,22 +554,31 @@
'<h6 id="' + idInfo + '" class="upload-info"></h6>';
},
- template_row: function(id, select_extension)
+ template_row: function(id)
{
// construct template
var tmpl = '<tr id="' + id.substr(1) + '" class="upload-item">' +
'<td><div id="title" class="title"></div></td>' +
- '<td><div id="size" class="size"></div></td>' +
- '<td>' +
+ '<td><div id="size" class="size"></div></td>';
+
+ // add file type selectore
+ tmpl += '<td>' +
'<select id="extension" class="extension">';
+ for (key in this.select_extension)
+ tmpl += '<option value="' + key + '">' + this.select_extension[key] + '</option>';
+ tmpl += '</select>' +
+ '</td>';
- // add file types to selection
- for (key in select_extension)
- tmpl += '<option value="' + key + '">' + select_extension[key] + '</option>';
-
+ // add genome selector
+ tmpl += '<td>' +
+ '<select id="genome" class="genome">';
+ for (key in this.select_genome)
+ tmpl += '<option value="' + key + '">' + this.select_genome[key] + '</option>';
tmpl += '</select>' +
- '</td>' +
- '<td><input id="space_to_tabs" type="checkbox"></input></td>' +
+ '</td>';
+
+ // add next row
+ tmpl += '<td><input id="space_to_tabs" type="checkbox"></input></td>' +
'<td>' +
'<div id="info" class="info">' +
'<div class="progress">' +
diff -r 9da85b79e0000f83e0232c06188ce89fd0e969a8 -r 443c97b1017eae9374ab91c0d87a19252db8a0e0 static/style/blue/base.css
--- a/static/style/blue/base.css
+++ b/static/style/blue/base.css
@@ -1130,6 +1130,7 @@
.upload-box .table td{margin:0px;paddign:0px}
.upload-box .title{width:130px;word-wrap:break-word;font-size:11px}
.upload-box .extension{width:100px;font-size:11px}
+.upload-box .genome{width:150px;font-size:11px}
.upload-box .size{width:60px;white-space:nowrap}
.upload-box .info{width:130px;font-size:11px}.upload-box .info .progress{top:1px;position:relative;width:100%;padding:0px;margin:0px}.upload-box .info .progress .progress-bar{border-radius:inherit;-moz-border-radius:inherit}
.upload-box .info .progress .percentage{position:absolute;text-align:center;width:100%;color:#fff}
diff -r 9da85b79e0000f83e0232c06188ce89fd0e969a8 -r 443c97b1017eae9374ab91c0d87a19252db8a0e0 static/style/src/less/upload.less
--- a/static/style/src/less/upload.less
+++ b/static/style/src/less/upload.less
@@ -44,6 +44,11 @@
font-size : @font-size-small;
}
+ .genome {
+ width: 150px;
+ font-size : @font-size-small;
+ }
+
.size {
width: 60px;
white-space: nowrap;
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
6 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/9ce5afd13d9a/
Changeset: 9ce5afd13d9a
Branch: action_tpye_autoconf
User: BjoernGruening
Date: 2013-09-16 12:07:44
Summary: Add autoconf tag. ./configure && make && make install in one command.
If 'prefix=' is not set, use prefix=$INSTALL_DIR as default
Affected #: 2 files
diff -r 5b17d89780b381bfd53712f964e0b53137adc322 -r 9ce5afd13d9a1741d911474cea4d7c46ef1f497d lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -409,6 +409,20 @@
return_code = handle_command( app, tool_dependency, install_dir, cmd )
if return_code:
return
+ elif action_type == 'autoconf':
+ # Handle configure, make and make install allow providing configuration options
+ with settings( warn_only=True ):
+ configure_opts = action_dict.get( 'configure_opts', '' )
+ if 'prefix=' in configure_opts:
+ pre_cmd = './configure %s && make && make install' % configure_opts
+ else:
+ pre_cmd = './configure prefix=$INSTALL_DIR %s && make && make install' % configure_opts
+
+ cmd = install_environment.build_command( common_util.evaluate_template( pre_cmd, install_dir ) )
+ log.warning(cmd)
+ return_code = handle_command( app, tool_dependency, install_dir, cmd )
+ if return_code:
+ return
elif action_type == 'download_file':
# Download a single file to the current working directory.
url = action_dict[ 'url' ]
diff -r 5b17d89780b381bfd53712f964e0b53137adc322 -r 9ce5afd13d9a1741d911474cea4d7c46ef1f497d lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -602,6 +602,11 @@
# lxml==2.3.0</action>
## Manually specify contents of requirements.txt file to create dynamically.
action_dict[ 'requirements' ] = td_common_util.evaluate_template( action_elem.text or 'requirements.txt', install_dir )
+ elif action_type == 'autoconf':
+ # Handle configure, make and make install allow providing configuration options
+ if action_elem.text:
+ configure_opts = td_common_util.evaluate_template( action_elem.text, install_dir )
+ action_dict[ 'configure_opts' ] = configure_opts
elif action_type == 'chmod':
# Change the read, write, and execute bits on a file.
# <action type="chmod">
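The command the new autoconf action type assembles can be summarized in isolation. This is a standalone Python sketch copying the branch logic from the fabric_util.py hunk above; it is not the actual code path, which additionally runs the result through evaluate_template() and install_environment.build_command():

    # Standalone sketch of the autoconf command construction.
    def build_autoconf_command(configure_opts=''):
        # './configure && make && make install' in one command; if the user
        # did not set 'prefix=', default it to $INSTALL_DIR (expanded later
        # by evaluate_template in the real code).
        if 'prefix=' in configure_opts:
            return './configure %s && make && make install' % configure_opts
        return './configure prefix=$INSTALL_DIR %s && make && make install' % configure_opts

    print(build_autoconf_command('prefix=/opt/pkg --with-zlib'))
    # ./configure prefix=/opt/pkg --with-zlib && make && make install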
https://bitbucket.org/galaxy/galaxy-central/commits/ac77cc9c867e/
Changeset: ac77cc9c867e
Branch: action_tpye_autoconf
User: BjoernGruening
Date: 2013-09-16 15:24:12
Summary: merge
Affected #: 1 file
diff -r 9ce5afd13d9a1741d911474cea4d7c46ef1f497d -r ac77cc9c867e300c801281195e281beae936633a lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -393,7 +393,7 @@
return
elif action_type == 'shell_command':
with settings( warn_only=True ):
- install_environment.add_env_shell_file_paths( action_dict[ 'env_shell_file_paths' ] )
+ cmd = install_environment.build_command( action_dict[ 'command' ] )
return_code = handle_command( app, tool_dependency, install_dir, cmd )
if return_code:
return
https://bitbucket.org/galaxy/galaxy-central/commits/1ceb38304621/
Changeset: 1ceb38304621
Branch: action_tpye_autoconf
User: BjoernGruening
Date: 2013-09-16 15:35:39
Summary: missed a port from common_util to td_common_util
Affected #: 1 file
diff -r ac77cc9c867e300c801281195e281beae936633a -r 1ceb38304621afa96c38cef52286b889f4c575bc lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -418,7 +418,7 @@
else:
pre_cmd = './configure prefix=$INSTALL_DIR %s && make && make install' % configure_opts
- cmd = install_environment.build_command( common_util.evaluate_template( pre_cmd, install_dir ) )
+ cmd = install_environment.build_command( td_common_util.evaluate_template( pre_cmd, install_dir ) )
log.warning(cmd)
return_code = handle_command( app, tool_dependency, install_dir, cmd )
if return_code:
https://bitbucket.org/galaxy/galaxy-central/commits/29c686a67dfb/
Changeset: 29c686a67dfb
Branch: action_tpye_autoconf
User: BjoernGruening
Date: 2013-09-16 15:52:41
Summary: Removing debug statement, as pointed out by John.
Affected #: 1 file
diff -r 1ceb38304621afa96c38cef52286b889f4c575bc -r 29c686a67dfb8c97dcf2d14546379b81ac00bd6c lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/fabric_util.py
@@ -419,7 +419,6 @@
pre_cmd = './configure prefix=$INSTALL_DIR %s && make && make install' % configure_opts
cmd = install_environment.build_command( td_common_util.evaluate_template( pre_cmd, install_dir ) )
- log.warning(cmd)
return_code = handle_command( app, tool_dependency, install_dir, cmd )
if return_code:
return
https://bitbucket.org/galaxy/galaxy-central/commits/65158f203ac7/
Changeset: 65158f203ac7
User: BjoernGruening
Date: 2013-09-23 14:09:26
Summary: Merged galaxy/galaxy-central into default
Affected #: 71 files
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a config/plugins/visualizations/graphview/config/graphview.xml
--- a/config/plugins/visualizations/graphview/config/graphview.xml
+++ b/config/plugins/visualizations/graphview/config/graphview.xml
@@ -16,5 +16,5 @@
<params><param type="dataset" var_name_in_template="hda" required="true">dataset_id</param></params>
- <template>graphview/templates/graphview.mako</template>
+ <template>graphview.mako</template></visualization>
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a config/plugins/visualizations/scatterplot/config/scatterplot.xml
--- a/config/plugins/visualizations/scatterplot/config/scatterplot.xml
+++ b/config/plugins/visualizations/scatterplot/config/scatterplot.xml
@@ -11,5 +11,5 @@
<params><param type="dataset" var_name_in_template="hda" required="true">dataset_id</param></params>
- <template>scatterplot/templates/scatterplot.mako</template>
+ <template>scatterplot.mako</template></visualization>
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/app.py
--- a/lib/galaxy/app.py
+++ b/lib/galaxy/app.py
@@ -22,6 +22,8 @@
from galaxy.openid.providers import OpenIDProviders
from galaxy.tools.data_manager.manager import DataManagers
+from galaxy.web.base import pluginframework
+
import logging
log = logging.getLogger( __name__ )
@@ -123,8 +125,11 @@
# Load genome indexer tool.
load_genome_index_tools( self.toolbox )
# visualizations registry: associates resources with visualizations, controls how to render
- self.visualizations_registry = VisualizationsRegistry.from_config(
- self.config.visualizations_plugins_directory, self.config )
+ self.visualizations_registry = None
+ if self.config.visualizations_plugins_directory:
+ self.visualizations_registry = VisualizationsRegistry( self,
+ directories_setting=self.config.visualizations_plugins_directory,
+ template_cache_dir=self.config.template_cache )
# Load security policy.
self.security_agent = self.model.security_agent
self.host_security_agent = galaxy.security.HostAgent( model=self.security_agent.model, permitted_actions=self.security_agent.permitted_actions )
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/config.py
--- a/lib/galaxy/config.py
+++ b/lib/galaxy/config.py
@@ -209,6 +209,7 @@
if self.nginx_upload_store:
self.nginx_upload_store = os.path.abspath( self.nginx_upload_store )
self.object_store = kwargs.get( 'object_store', 'disk' )
+ self.object_store_check_old_style = string_as_bool( kwargs.get( 'object_store_check_old_style', False ) )
self.object_store_cache_path = resolve_path( kwargs.get( "object_store_cache_path", "database/object_store_cache" ), self.root )
# Handle AWS-specific config options for backward compatibility
if kwargs.get( 'aws_access_key', None) is not None:
@@ -294,9 +295,7 @@
self.fluent_log = string_as_bool( kwargs.get( 'fluent_log', False ) )
self.fluent_host = kwargs.get( 'fluent_host', 'localhost' )
self.fluent_port = int( kwargs.get( 'fluent_port', 24224 ) )
- # PLUGINS:
- self.plugin_frameworks = []
- # visualization framework
+ # visualization plugin framework
self.visualizations_plugins_directory = kwargs.get( 'visualizations_plugins_directory', None )
@property
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -25,7 +25,7 @@
from galaxy.datatypes.metadata import MetadataCollection
from galaxy.model.item_attrs import Dictifiable, UsesAnnotations
from galaxy.security import get_permitted_actions
-from galaxy.util import is_multi_byte, nice_size, Params, restore_text, send_mail
+from galaxy.util import asbool, is_multi_byte, nice_size, Params, restore_text, send_mail
from galaxy.util.bunch import Bunch
from galaxy.util.hash_util import new_secure_hash
from galaxy.web.framework.helpers import to_unicode
@@ -34,6 +34,7 @@
WorkflowMappingField)
from sqlalchemy.orm import object_session
from sqlalchemy.sql.expression import func
+from tool_shed.util import common_util
log = logging.getLogger( __name__ )
@@ -2416,7 +2417,8 @@
da = self.history_dataset or self.library_dataset
if self.object_store_id is None and da is not None:
self.object_store_id = da.dataset.object_store_id
- da.dataset.object_store.create( self, extra_dir='_metadata_files', extra_dir_at_root=True, alt_name="metadata_%d.dat" % self.id )
+ if not da.dataset.object_store.exists( self, extra_dir='_metadata_files', extra_dir_at_root=True, alt_name="metadata_%d.dat" % self.id ):
+ da.dataset.object_store.create( self, extra_dir='_metadata_files', extra_dir_at_root=True, alt_name="metadata_%d.dat" % self.id )
path = da.dataset.object_store.get_filename( self, extra_dir='_metadata_files', extra_dir_at_root=True, alt_name="metadata_%d.dat" % self.id )
return path
except AttributeError:
@@ -3462,7 +3464,28 @@
@property
def has_repository_dependencies( self ):
if self.metadata:
- return 'repository_dependencies' in self.metadata
+ repository_dependencies_dict = self.metadata.get( 'repository_dependencies', {} )
+ repository_dependencies = repository_dependencies_dict.get( 'repository_dependencies', [] )
+ # [["http://localhost:9009", "package_libgtextutils_0_6", "test", "e2003cbf18cd", "True", "True"]]
+ for rd_tup in repository_dependencies:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
+ if not asbool( only_if_compiling_contained_td ):
+ return True
+ return False
+
+ @property
+ def has_repository_dependencies_only_if_compiling_contained_td( self ):
+ if self.metadata:
+ repository_dependencies_dict = self.metadata.get( 'repository_dependencies', {} )
+ repository_dependencies = repository_dependencies_dict.get( 'repository_dependencies', [] )
+ # [["http://localhost:9009", "package_libgtextutils_0_6", "test", "e2003cbf18cd", "True", "True"]]
+ for rd_tup in repository_dependencies:
+ tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = \
+ common_util.parse_repository_dependency_tuple( rd_tup )
+ if not asbool( only_if_compiling_contained_td ):
+ return False
+ return True
return False
@property
@@ -3694,10 +3717,14 @@
@property
def tuples_of_repository_dependencies_needed_for_compiling_td( self ):
- """Return this repository's repository dependencies that are necessary only for compiling this repository's tool dependencies."""
+ """
+ Return tuples defining this repository's repository dependencies that are necessary only for compiling this repository's tool
+ dependencies.
+ """
rd_tups_of_repositories_needed_for_compiling_td = []
- if self.has_repository_dependencies:
- rd_tups = self.metadata[ 'repository_dependencies' ][ 'repository_dependencies' ]
+ if self.metadata:
+ repository_dependencies = self.metadata.get( 'repository_dependencies', None )
+ rd_tups = repository_dependencies[ 'repository_dependencies' ]
for rd_tup in rd_tups:
if len( rd_tup ) == 6:
tool_shed, name, owner, changeset_revision, prior_installation_required, only_if_compiling_contained_td = rd_tup
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/objectstore/__init__.py
--- a/lib/galaxy/objectstore/__init__.py
+++ b/lib/galaxy/objectstore/__init__.py
@@ -198,6 +198,7 @@
super(DiskObjectStore, self).__init__(config, file_path=file_path, extra_dirs=extra_dirs)
self.file_path = file_path or config.file_path
self.config = config
+ self.check_old_style = config.object_store_check_old_style
self.extra_dirs['job_work'] = config.job_working_directory
self.extra_dirs['temp'] = config.new_file_path
if extra_dirs is not None:
@@ -264,14 +265,13 @@
return os.path.abspath(path)
def exists(self, obj, **kwargs):
- path = self._construct_path(obj, old_style=True, **kwargs)
- # For backward compatibility, check root path first; otherwise, construct
- # and check hashed path
- if os.path.exists(path):
- return True
- else:
- path = self._construct_path(obj, **kwargs)
- return os.path.exists(path)
+ if self.check_old_style:
+ path = self._construct_path(obj, old_style=True, **kwargs)
+ # For backward compatibility, check root path first; otherwise, construct
+ # and check hashed path
+ if os.path.exists(path):
+ return True
+ return os.path.exists(self._construct_path(obj, **kwargs))
def create(self, obj, **kwargs):
if not self.exists(obj, **kwargs):
@@ -320,13 +320,13 @@
return content
def get_filename(self, obj, **kwargs):
- path = self._construct_path(obj, old_style=True, **kwargs)
- # For backward compatibility, check root path first; otherwise, construct
- # and return hashed path
- if os.path.exists(path):
- return path
- else:
- return self._construct_path(obj, **kwargs)
+ if self.check_old_style:
+ path = self._construct_path(obj, old_style=True, **kwargs)
+ # For backward compatibility, check root path first; otherwise, construct
+ # and return hashed path
+ if os.path.exists(path):
+ return path
+ return self._construct_path(obj, **kwargs)
def update_from_file(self, obj, file_name=None, create=False, **kwargs):
""" `create` parameter is not used in this implementation """
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -154,8 +154,24 @@
# This will cover cases where the Galaxy administrator manually edited one or more of the tool panel
# config files, adding or removing locally developed tools or workflows. The value of integrated_tool_panel
# will be False when things like functional tests are the caller.
+ self.fix_integrated_tool_panel_dict()
self.write_integrated_tool_panel_config_file()
+ def fix_integrated_tool_panel_dict( self ):
+ # HACK: instead of fixing after the fact, I suggest some combination of:
+ # 1) adjusting init_tools() and called methods to get this right
+ # 2) redesigning the code and/or data structure used to read/write integrated_tool_panel.xml
+ for key, value in self.integrated_tool_panel.iteritems():
+ if key.startswith( 'section_' ):
+ for section_key, section_value in value.elems.iteritems():
+ if section_value is None:
+ if section_key.startswith( 'tool_' ):
+ tool_id = section_key[5:]
+ value.elems[section_key] = self.tools_by_id.get( tool_id )
+ elif section_key.startswith( 'workflow_' ):
+ workflow_id = section_key[9:]
+ value.elems[section_key] = self.workflows_by_id.get( workflow_id )
+
def init_tools( self, config_filename ):
"""
Read the configuration file and load each tool. The following tags are currently supported:
@@ -1852,7 +1868,7 @@
# TODO: Anyway to capture tools that dynamically change their own
# outputs?
return True
- def new_state( self, trans, all_pages=False ):
+ def new_state( self, trans, all_pages=False, history=None ):
"""
Create a new `DefaultToolState` for this tool. It will be initialized
with default values for inputs.
@@ -1866,16 +1882,16 @@
inputs = self.inputs
else:
inputs = self.inputs_by_page[ 0 ]
- self.fill_in_new_state( trans, inputs, state.inputs )
+ self.fill_in_new_state( trans, inputs, state.inputs, history=history )
return state
- def fill_in_new_state( self, trans, inputs, state, context=None ):
+ def fill_in_new_state( self, trans, inputs, state, context=None, history=None ):
"""
Fill in a tool state dictionary with default values for all parameters
in the dictionary `inputs`. Grouping elements are filled in recursively.
"""
context = ExpressionContext( state, context )
for input in inputs.itervalues():
- state[ input.name ] = input.get_initial_value( trans, context )
+ state[ input.name ] = input.get_initial_value( trans, context, history=history )
def get_param_html_map( self, trans, page=0, other_values={} ):
"""
Return a dictionary containing the HTML representation of each
@@ -1938,7 +1954,7 @@
state = DefaultToolState()
state.decode( encoded_state, self, trans.app )
else:
- state = self.new_state( trans )
+ state = self.new_state( trans, history=history )
# This feels a bit like a hack. It allows forcing full processing
# of inputs even when there is no state in the incoming dictionary
# by providing either 'runtool_btn' (the name of the submit button
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -72,13 +72,13 @@
"""
return value
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
"""
Return the starting value of the parameter
"""
return None
- def get_initial_value_from_history_prevent_repeats( self, trans, context, already_used ):
+ def get_initial_value_from_history_prevent_repeats( self, trans, context, already_used, history=None ):
"""
Get the starting value for the parameter, but if fetching from the history, try
to find a value that has not yet been used. already_used is a list of objects that
@@ -86,7 +86,7 @@
if a value has already been chosen from the history. This is to support the capability to
choose each dataset once
"""
- return self.get_initial_value(trans, context);
+ return self.get_initial_value(trans, context, history=history);
def get_required_enctype( self ):
"""
@@ -216,7 +216,7 @@
return form_builder.TextArea( self.name, self.size, value )
else:
return form_builder.TextField( self.name, self.size, value )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return self.value
class IntegerToolParameter( TextToolParameter ):
@@ -286,7 +286,7 @@
if not value and self.optional:
return None
raise err
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
if self.value:
return int( self.value )
else:
@@ -358,7 +358,7 @@
if not value and self.optional:
return None
raise err
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
try:
return float( self.value )
except:
@@ -401,7 +401,7 @@
return [ 'true' ]
def to_python( self, value, app ):
return ( value == 'True' )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return self.checked
def to_param_dict_string( self, value, other_values={} ):
if value:
@@ -474,7 +474,7 @@
return value
else:
raise Exception( "FileToolParameter cannot be persisted" )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return None
class FTPFileToolParameter( ToolParameter ):
@@ -513,7 +513,7 @@
return None
elif isinstance( value, unicode ) or isinstance( value, str ) or isinstance( value, list ):
return value
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return None
class HiddenToolParameter( ToolParameter ):
@@ -534,7 +534,7 @@
self.value = elem.get( 'value' )
def get_html_field( self, trans=None, value=None, other_values={} ):
return form_builder.HiddenField( self.name, self.value )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return self.value
def get_label( self ):
return None
@@ -557,7 +557,7 @@
return url
def get_html_field( self, trans=None, value=None, other_values={} ):
return form_builder.HiddenField( self.name, self.get_value( trans ) )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return self.value
def get_label( self ):
# BaseURLToolParameters are ultimately "hidden" parameters
@@ -826,7 +826,7 @@
return True
# Dynamic, but all dependenceis are known and have values
return False
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
# More working around dynamic options for workflow
if self.need_late_validation( trans, context ):
# Really the best we can do?
@@ -1074,7 +1074,7 @@
options.append( ( 'c' + col, col, False ) )
return options
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
if self.default_value is not None:
# dataset not ready / in workflow / etc
if self.need_late_validation( trans, context ):
@@ -1353,8 +1353,7 @@
else:
rval = sanitize_param( rval )
return rval
-
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
def recurse_options( initial_values, options ):
for option in options:
if option['selected']:
@@ -1542,10 +1541,10 @@
field.add_option( "Selection is Optional", 'None', False )
return field
- def get_initial_value( self, trans, context ):
- return self.get_initial_value_from_history_prevent_repeats(trans, context, None);
+ def get_initial_value( self, trans, context, history=None ):
+ return self.get_initial_value_from_history_prevent_repeats(trans, context, None, history=history);
- def get_initial_value_from_history_prevent_repeats( self, trans, context, already_used ):
+ def get_initial_value_from_history_prevent_repeats( self, trans, context, already_used, history=None ):
"""
NOTE: This is wasteful since dynamic options and dataset collection
happens twice (here and when generating HTML).
@@ -1554,7 +1553,8 @@
if trans is None or trans.workflow_building_mode or trans.webapp.name == 'tool_shed':
return DummyDataset()
assert trans is not None, "DataToolParameter requires a trans"
- history = trans.get_history()
+ if history is None:
+ history = trans.get_history()
assert history is not None, "DataToolParameter requires a history"
if self.optional:
return None
@@ -1613,6 +1613,9 @@
rval = [ trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( v ) for v in value ]
elif isinstance( value, trans.app.model.HistoryDatasetAssociation ):
rval = value
+ elif isinstance( value, dict ) and 'src' in value and 'id' in value:
+ if value['src'] == 'hda':
+ rval = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( trans.app.security.decode_id(value['id']) )
else:
rval = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
if isinstance( rval, list ):
@@ -1727,7 +1730,7 @@
self.value = "None"
self.type = "hidden_data"
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
return None
def get_html_field( self, trans=None, value=None, other_values={} ):
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/tools/parameters/grouping.py
--- a/lib/galaxy/tools/parameters/grouping.py
+++ b/lib/galaxy/tools/parameters/grouping.py
@@ -41,8 +41,7 @@
into the preferred value form.
"""
return value
-
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
"""
Return the initial state/value for this group
"""
@@ -109,7 +108,7 @@
callback( new_prefix, input, d[input.name], parent = d )
else:
input.visit_inputs( new_prefix, d[input.name], callback )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
rval = []
for i in range( self.default ):
rval_dict = { '__index__': i}
@@ -202,14 +201,14 @@
callback( new_prefix, input, d[input.name], parent = d )
else:
input.visit_inputs( new_prefix, d[input.name], callback )
- def get_initial_value( self, trans, context ):
+ def get_initial_value( self, trans, context, history=None ):
d_type = self.get_datatype( trans, context )
rval = []
for i, ( composite_name, composite_file ) in enumerate( d_type.writable_files.iteritems() ):
rval_dict = {}
rval_dict['__index__'] = i # create __index__
for input in self.inputs.itervalues():
- rval_dict[ input.name ] = input.get_initial_value( trans, context ) #input.value_to_basic( d[input.name], app )
+ rval_dict[ input.name ] = input.get_initial_value( trans, context, history=history ) #input.value_to_basic( d[input.name], app )
rval.append( rval_dict )
return rval
def get_uploaded_datasets( self, trans, context, override_name = None, override_info = None ):
@@ -489,12 +488,12 @@
callback( prefix, input, value[input.name], parent = value )
else:
input.visit_inputs( prefix, value[input.name], callback )
- def get_initial_value( self, trans, context ):
- # State for a conditional is a plain dictionary.
+ def get_initial_value( self, trans, context, history=None ):
+ # State for a conditional is a plain dictionary.
rval = {}
# Get the default value for the 'test element' and use it
# to determine the current case
- test_value = self.test_param.get_initial_value( trans, context )
+ test_value = self.test_param.get_initial_value( trans, context, history=None )
current_case = self.get_current_case( test_value, trans )
# Store the current case in a special value
rval['__current_case__'] = current_case
@@ -503,7 +502,7 @@
# Fill in state for selected case
child_context = ExpressionContext( rval, context )
for child_input in self.cases[current_case].inputs.itervalues():
- rval[ child_input.name ] = child_input.get_initial_value( trans, child_context )
+ rval[ child_input.name ] = child_input.get_initial_value( trans, child_context, history=None )
return rval
class ConditionalWhen( object ):
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/visualization/registry.py
--- a/lib/galaxy/visualization/registry.py
+++ b/lib/galaxy/visualization/registry.py
@@ -39,9 +39,10 @@
"""
# ------------------------------------------------------------------- the registry
-class VisualizationsRegistry( pluginframework.PluginFramework ):
+class VisualizationsRegistry( pluginframework.PageServingPluginManager ):
"""
Main responsibilities are:
+ - discovering visualization plugins in the filesystem
- testing if an object has a visualization that can be applied to it
- generating a link to controllers.visualization.render with
the appropriate params
@@ -58,31 +59,81 @@
'sweepster',
'phyloviz'
]
- #: directories under plugin_directory that aren't plugins
- non_plugin_directories = []
- def __init__( self, registry_filepath, template_cache_dir ):
- super( VisualizationsRegistry, self ).__init__( registry_filepath, 'visualizations', template_cache_dir )
+ def __str__( self ):
+ return self.__class__.__name__
+ def __init__( self, app, **kwargs ):
+ self.config_parser = VisualizationsConfigParser()
+ super( VisualizationsRegistry, self ).__init__( app, 'visualizations', **kwargs )
# what to use to parse query strings into resources/vars for the template
self.resource_parser = ResourceParser()
log.debug( '%s loaded', str( self ) )
- def load_configuration( self ):
+ def is_plugin( self, plugin_path ):
"""
- Builds the registry by parsing the `config/*.xml` files for every plugin
- in ``get_plugin_directories`` and stores the results in ``self.listings``.
+ Determines whether the given filesystem path contains a plugin.
- ..note::
- This could be used to re-load a new configuration without restarting
- the instance.
+ A plugin directory must contain a 'config' sub-directory with an XML file named after the plugin.
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the
+ potential plugin
+ :rtype: bool
+ :returns: True if the path contains a plugin
"""
- try:
- self.listings = VisualizationsConfigParser.parse( self.get_plugin_directories() )
+ # plugin_path must be a directory, have a config dir, and a config file matching the plugin dir name
+ if not os.path.isdir( plugin_path ):
+ # super won't work here - different criteria
+ return False
+ if 'config' not in os.listdir( plugin_path ):
+ return False
+ expected_config_filename = '%s.xml' %( os.path.split( plugin_path )[1] )
+ if not os.path.isfile( os.path.join( plugin_path, 'config', expected_config_filename ) ):
+ return False
+ return True
- except Exception, exc:
- log.exception( 'Error parsing visualizations plugins %s', self.plugin_directories )
- raise
+ def load_plugin( self, plugin_path ):
+ """
+ Create the visualization plugin object, parse its configuration file,
+ and return it.
+
+ Plugin bunches are decorated with:
+ * config_file : the path to this visualization's config file
+ * config : the parsed configuration for this visualization
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the plugin
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ #TODO: possibly move this after the config parsing to allow config to override?
+ plugin = super( VisualizationsRegistry, self ).load_plugin( plugin_path )
+
+ # config file is required, otherwise skip this visualization
+ plugin[ 'config_file' ] = os.path.join( plugin_path, 'config', ( plugin.name + '.xml' ) )
+ config = self.config_parser.parse_file( plugin.config_file )
+
+ if not config:
+ return None
+ plugin[ 'config' ] = config
+
+ return plugin
+
+ # -- getting resources for visualization templates from link query strings --
+ # -- building links to visualizations from objects --
+ def get_visualizations( self, trans, target_object ):
+ """
+ Get the names of visualizations usable on the `target_object` and
+ the urls to call in order to render the visualizations.
+ """
+ #TODO:?? a list of objects? YAGNI?
+ applicable_visualizations = []
+ for vis_name in self.plugins:
+ url_data = self.get_visualization( trans, vis_name, target_object )
+ if url_data:
+ applicable_visualizations.append( url_data )
+ return applicable_visualizations
def get_visualization( self, trans, visualization_name, target_object ):
"""
@@ -90,12 +141,11 @@
`visualization_name` if it's applicable to `target_object` or
`None` if it's not.
"""
- # a little weird to pass trans because this registry is part of the trans.app
- listing_data = self.listings.get( visualization_name, None )
- if not listing_data:
+ visualization = self.plugins.get( visualization_name, None )
+ if not visualization:
return None
- data_sources = listing_data[ 'data_sources' ]
+ data_sources = visualization.config[ 'data_sources' ]
for data_source in data_sources:
# currently a model class is required
model_class = data_source[ 'model_class' ]
@@ -109,11 +159,11 @@
param_data = data_source[ 'to_params' ]
url = self.get_visualization_url( trans, target_object, visualization_name, param_data )
- link_text = listing_data.get( 'link_text', None )
+ link_text = visualization.config.get( 'link_text', None )
if not link_text:
# default to visualization name, titlecase, and replace underscores
link_text = visualization_name.title().replace( '_', ' ' )
- render_location = listing_data.get( 'render_location' )
+ render_location = visualization.config.get( 'render_location' )
# remap some of these vars for direct use in ui.js, PopupMenu (e.g. text->html)
return {
'href' : url,
@@ -123,25 +173,12 @@
return None
- # -- building links to visualizations from objects --
- def get_visualizations( self, trans, target_object ):
- """
- Get the names of visualizations usable on the `target_object` and
- the urls to call in order to render the visualizations.
- """
- #TODO:?? a list of objects? YAGNI?
- applicable_visualizations = []
- for vis_name in self.listings:
- url_data = self.get_visualization( trans, vis_name, target_object )
- if url_data:
- applicable_visualizations.append( url_data )
- return applicable_visualizations
-
def is_object_applicable( self, trans, target_object, data_source_tests ):
"""
Run a visualization's data_source tests to find out if
it can be applied to the target_object.
"""
+ #log.debug( 'is_object_applicable( self, trans, %s, %s )', target_object, data_source_tests )
for test in data_source_tests:
test_type = test[ 'type' ]
result_type = test[ 'result_type' ]
@@ -164,7 +201,6 @@
if test_fn( target_object, test_result ):
#log.debug( 'test passed' )
return True
-
return False
def get_visualization_url( self, trans, target_object, visualization_name, param_data ):
@@ -219,9 +255,9 @@
Both `params` and `param_modifiers` default to an empty dictionary.
"""
- visualization = self.listings.get( visualization_name )
- expected_params = visualization.get( 'params', {} )
- param_modifiers = visualization.get( 'param_modifiers', {} )
+ visualization = self.plugins.get( visualization_name )
+ expected_params = visualization.config.get( 'params', {} )
+ param_modifiers = visualization.config.get( 'param_modifiers', {} )
return ( expected_params, param_modifiers )
def query_dict_to_resources( self, trans, controller, visualization_name, query_dict ):
@@ -259,13 +295,6 @@
"""
VALID_RENDER_LOCATIONS = [ 'galaxy_main', '_top', '_blank' ]
- @classmethod
- def parse( cls, plugin_directories, debug=False ):
- """
- Static class interface.
- """
- return cls( debug ).parse_plugins( plugin_directories )
-
def __init__( self, debug=False ):
self.debug = debug
@@ -274,58 +303,19 @@
self.param_parser = ParamParser()
self.param_modifier_parser = ParamModifierParser()
- def parse_plugins( self, plugin_directories ):
- """
- Parses the config files for each plugin sub-dir in `base_path`.
-
- :param plugin_directories: a list of paths to enabled plugins.
-
- :returns: registry data in dictionary form
- """
- returned = {}
- for plugin_path in plugin_directories:
- returned.update( self.parse_plugin( plugin_path ) )
- return returned
-
- def parse_plugin( self, plugin_path ):
- """
- Parses any XML files in ``<plugin_path>/config``.
-
- If an error occurs while parsing a visualizations entry, it is skipped.
- :returns: registry data in dictionary form
- ..note::
- assumes config files are in a 'config' sub-dir of each plugin
- """
- returned = {}
-
- plugin_config_path = os.path.join( plugin_path, 'config' )
- if not os.path.isdir( plugin_config_path ):
- return returned
-
- for xml_filepath in glob.glob( os.path.join( plugin_config_path, '*.xml' ) ):
- try:
- visualization_name, visualization = self.parse_file( xml_filepath )
- # skip vis' with parsing errors - don't shutdown the startup
- except ParsingException, parse_exc:
- log.error( 'Skipped visualization config "%s" due to parsing errors: %s',
- xml_filepath, str( parse_exc ), exc_info=self.debug )
-
- if visualization:
- returned[ visualization_name ] = visualization
- log.debug( 'Visualization config loaded for: %s', visualization_name )
-
- return returned
-
def parse_file( self, xml_filepath ):
"""
Parse the given XML file for visualizations data.
:returns: tuple of ( `visualization_name`, `visualization` )
"""
- xml_tree = galaxy.util.parse_xml( xml_filepath )
- visualization_conf = xml_tree.getroot()
- visualization_name = visualization_conf.get( 'name' )
- visualization = self.parse_visualization( visualization_conf )
- return visualization_name, visualization
+ try:
+ xml_tree = galaxy.util.parse_xml( xml_filepath )
+ visualization = self.parse_visualization( xml_tree.getroot() )
+ return visualization
+ # skip vis' with parsing errors - don't shutdown the startup
+ except ParsingException, parse_exc:
+ log.exception( 'Skipped visualization config "%s" due to parsing errors', xml_filepath )
+ return None
def parse_visualization( self, xml_tree ):
"""
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -587,8 +587,10 @@
hda_dict[ 'api_type' ] = "file"
# Add additional attributes that depend on trans and hence must be added here rather than at the model level.
-
- #NOTE: access is an expensive operation - removing it and adding the precondition of access is already checked
+ can_access_hda = trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), hda.dataset )
+ can_access_hda = ( trans.user_is_admin() or can_access_hda )
+ if not can_access_hda:
+ return self.get_inaccessible_hda_dict( trans, hda )
hda_dict[ 'accessible' ] = True
# ---- return here if deleted AND purged OR can't access
@@ -634,6 +636,18 @@
return trans.security.encode_dict_ids( hda_dict )
+ def get_inaccessible_hda_dict( self, trans, hda ):
+ return trans.security.encode_dict_ids({
+ 'id' : hda.id,
+ 'history_id': hda.history.id,
+ 'hid' : hda.hid,
+ 'name' : hda.name,
+ 'state' : hda.state,
+ 'deleted' : hda.deleted,
+ 'visible' : hda.visible,
+ 'accessible': False
+ })
+
def get_hda_dict_with_error( self, trans, hda, error_msg='' ):
return trans.security.encode_dict_ids({
'id' : hda.id,
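
The access gate above replaces an unconditional ``accessible = True`` with a real check, returning a reduced dictionary when the user cannot access the dataset. A minimal sketch of the shape of that gate (plain dicts and callables stand in for trans and the security agent):

    def hda_to_dict(hda, user_is_admin, can_access_dataset):
        accessible = user_is_admin or can_access_dataset(hda)
        if not accessible:
            # expose only identifying fields, never file paths or metadata
            return {'id': hda['id'], 'name': hda['name'], 'accessible': False}
        full = dict(hda)
        full['accessible'] = True
        return full

    hda = {'id': 1, 'name': 'reads.fastq', 'file_name': '/data/1.dat'}
    assert hda_to_dict(hda, False, lambda h: False) == \
        {'id': 1, 'name': 'reads.fastq', 'accessible': False}
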
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/web/base/pluginframework.py
--- a/lib/galaxy/web/base/pluginframework.py
+++ b/lib/galaxy/web/base/pluginframework.py
@@ -1,150 +1,106 @@
"""
Base class for plugins - frameworks or systems that may:
+ * add code at startup
+ * allow hooks to be called
+and base class for plugins that:
* serve static content
* serve templated html
* have some configuration at startup
"""
import os.path
-import glob
import sys
+import imp
import pkg_resources
pkg_resources.require( 'MarkupSafe' )
pkg_resources.require( 'Mako' )
import mako
-from galaxy.util import listify
+from galaxy import util
+from galaxy.util import odict
+from galaxy.util import bunch
import logging
log = logging.getLogger( __name__ )
# ============================================================================= exceptions
-class PluginFrameworkException( Exception ):
+class PluginManagerException( Exception ):
"""Base exception for plugin frameworks.
"""
pass
-class PluginFrameworkConfigException( PluginFrameworkException ):
+class PluginManagerConfigException( PluginManagerException ):
"""Exception for plugin framework configuration errors.
"""
pass
-class PluginFrameworkStaticException( PluginFrameworkException ):
- """Exception for plugin framework static directory set up errors.
- """
- pass
-class PluginFrameworkTemplateException( PluginFrameworkException ):
- """Exception for plugin framework template directory
- and template rendering errors.
- """
- pass
# ============================================================================= base
-class PluginFramework( object ):
+class PluginManager( object ):
"""
- Plugins are files/directories living outside the Galaxy ``lib`` directory
- that serve static files (css, js, images, etc.), use and serve mako templates,
- and have some configuration to control the rendering.
+ A plugin represents a section of code that is not tracked in the
+ Galaxy repository, allowing custom code to be added to a Galaxy
+ installation without changing the code base.
- A plugin framework sets up all the above components.
+ A PluginManager discovers and manages these plugins.
+
+ This is a non-abstract class, but its usefulness is limited and it is
+ meant to be inherited.
"""
- #: does the class need a config file(s) to be parsed?
- has_config = True
- #: does the class need static files served?
- serves_static = True
- #: does the class need template files served?
- serves_templates = True
- #TODO: allow plugin mako inheritance from existing ``/templates`` files
- #uses_galaxy_templates = True
- #TODO: possibly better as instance var (or a combo)
- #: the directories in ``plugin_directory`` with basenames listed here will
- #: be ignored for config, static, and templates
- non_plugin_directories = []
- # ------------------------------------------------------------------------- setup
- @classmethod
- def from_config( cls, config_plugin_directory, config ):
+ def __init__( self, app, directories_setting=None, skip_bad_plugins=True, **kwargs ):
"""
- Set up the framework based on data from some config object by:
- * constructing it's absolute plugin_directory filepath
- * getting a template_cache
- * and appending itself to the config object's ``plugin_frameworks`` list
+ Set up the manager and load all plugins.
- .. note::
- precondition: config obj should have attributes:
- root, template_cache, and (list) plugin_frameworks
+ :type app: UniverseApplication
+ :param app: the application (and its configuration) using this manager
+ :type directories_setting: string (default: None)
+ :param directories_setting: the filesystem path (or paths)
+ to search for plugins. Can be CSV string of paths. Will be treated as
+ absolute if a path starts with '/', relative otherwise.
+ :type skip_bad_plugins: boolean (default: True)
+ :param skip_bad_plugins: whether to skip plugins that cause
+ exceptions when loaded or to raise that exception
"""
- # currently called from (base) app.py - defined here to allow override if needed
- if not config_plugin_directory:
- return None
- try:
- # create the plugin path and if plugin dir begins with '/' assume absolute path
- full_plugin_filepath = os.path.join( config.root, config_plugin_directory )
- if config_plugin_directory.startswith( os.path.sep ):
- full_plugin_filepath = config_plugin_directory
- if not os.path.exists( full_plugin_filepath ):
- raise PluginFrameworkException( 'Plugin path not found: %s' %( full_plugin_filepath ) )
+ log.debug( 'PluginManager.init: %s, %s', directories_setting, kwargs )
+ self.directories = []
+ self.skip_bad_plugins = skip_bad_plugins
+ self.plugins = odict.odict()
- template_cache = config.template_cache if cls.serves_static else None
- plugin = cls( full_plugin_filepath, template_cache )
+ self.directories = self.parse_directories_setting( app.config.root, directories_setting )
+ #log.debug( '\t directories: %s', self.directories )
- config.plugin_frameworks.append( plugin )
- return plugin
+ self.load_configuration()
+ self.load_plugins()
- except PluginFrameworkException, plugin_exc:
- log.exception( "Error loading framework %s. Skipping...", cls.__class__.__name__ )
- return None
+ def parse_directories_setting( self, galaxy_root, directories_setting ):
+ """
+ Parse the ``directories_setting`` into a list of relative or absolute
+ filesystem paths that will be searched to discover plugins.
- def __str__( self ):
- return '%s(%s)' %( self.__class__.__name__, self.plugin_directories )
+ :type galaxy_root: string
+ :param galaxy_root: the root path of this galaxy installation
+ :type directories_setting: string (default: None)
+ :param directories_setting: the filesystem path (or paths)
+ to search for plugins. Can be a CSV string of paths. A path is treated
+ as absolute if it starts with '/', relative otherwise.
+ :rtype: list of strings
+ :returns: list of filesystem paths
+ """
+ directories = []
+ if not directories_setting:
+ return directories
- def __init__( self, plugin_directories, name=None, template_cache_dir=None, debug=False, assert_exists=True ):
- """
- :type plugin_directories: string or list
- :param plugin_directories: the base directory where plugin code is kept
- :type name: (optional) string (default: None)
- :param name: the name of this plugin
- (that will appear in url pathing, etc.)
- :type template_cache_dir: (optional) string (default: None)
- :param template_cache_dir: the cache directory to store compiled mako
- :type assert_exists: (optional) bool (default: False)
- :param assert_exists: If True, each configured plugin directory must exist.
- """
- self.plugin_directories = listify( plugin_directories )
- if assert_exists:
- for plugin_directory in self.plugin_directories:
- if not os.path.isdir( plugin_directory ):
- raise PluginFrameworkException( 'Framework plugin directory not found: %s, %s'
- % ( self.__class__.__name__, plugin_directory ) )
+ for directory in util.listify( directories_setting ):
+ directory = directory.strip()
+ if not directory.startswith( '/' ):
+ directory = os.path.join( galaxy_root, directory )
+ if not os.path.exists( directory ):
+ log.warn( '%s, directory not found: %s', self, directory )
+ continue
+ directories.append( directory )
+ return directories
- #TODO: or pass in from config
- self.name = name or os.path.basename( self.plugin_directories[0] )
-
- if self.has_config:
- self.load_configuration()
- # set_up_static_urls will be called during the static middleware creation (if serves_static)
- if self.serves_templates:
- self.set_up_templates( template_cache_dir )
-
- def get_plugin_directories( self ):
- """
- Return the plugin directory paths for this plugin.
-
- Gets any directories within ``plugin_directory`` that are directories
- themselves and whose ``basename`` is not in ``plugin_directory``.
- """
- # could instead explicitly list on/off in master config file
- for plugin_directory in self.plugin_directories:
- for plugin_path in glob.glob( os.path.join( plugin_directory, '*' ) ):
- if not os.path.isdir( plugin_path ):
- continue
-
- if os.path.basename( plugin_path ) in self.non_plugin_directories:
- continue
-
- yield plugin_path
-
- # ------------------------------------------------------------------------- config
def load_configuration( self ):
"""
Override to load some framework/plugin-specific configuration.
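
The ``directories_setting`` handling described above resolves a CSV setting against the Galaxy root. A runnable restatement (POSIX paths; the existence check that the real method also performs is omitted here for brevity):

    import os

    def parse_directories_setting(galaxy_root, directories_setting):
        directories = []
        for directory in (directories_setting or '').split(','):
            directory = directory.strip()
            if not directory:
                continue
            if not directory.startswith('/'):
                # relative paths are resolved against the galaxy root
                directory = os.path.join(galaxy_root, directory)
            directories.append(directory)
        return directories

    assert parse_directories_setting('/srv/galaxy',
        'config/plugins/visualizations,/opt/plugins') == \
        ['/srv/galaxy/config/plugins/visualizations', '/opt/plugins']
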
@@ -152,6 +108,385 @@
# Abstract method
return True
+ def load_plugins( self ):
+ """
+ Search ``self.directories`` for potential plugins, load them, and cache
+ in ``self.plugins``.
+ :rtype: odict
+ :returns: ``self.plugins``
+ """
+ for plugin_path in self.find_plugins():
+ try:
+ plugin = self.load_plugin( plugin_path )
+ if not plugin:
+ log.warn( '%s, plugin load failed: %s. Skipping...', self, plugin_path )
+ #NOTE: prevent silent, implicit overwrite here (two plugins in two diff directories)
+ #TODO: overwriting may be desired
+ elif plugin.name in self.plugins:
+ log.warn( '%s, plugin with name already exists: %s. Skipping...', self, plugin.name )
+ else:
+ self.plugins[ plugin.name ] = plugin
+ log.info( '%s, loaded plugin: %s', self, plugin.name )
+
+ except Exception, exc:
+ if not self.skip_bad_plugins:
+ raise
+ log.exception( 'Plugin loading raised exception: %s. Skipping...', plugin_path )
+
+ return self.plugins
+
+ def find_plugins( self ):
+ """
+ Return the directory paths of plugins within ``self.directories``.
+
+ Paths are considered a plugin path if they pass ``self.is_plugin``.
+ :rtype: string generator
+ :returns: paths of valid plugins
+ """
+ # due to the ordering of listdir, there is an implicit plugin loading order here
+ # could instead explicitly list on/off in master config file
+ for directory in self.directories:
+ for plugin_dir in os.listdir( directory ):
+ plugin_path = os.path.join( directory, plugin_dir )
+ if self.is_plugin( plugin_path ):
+ yield plugin_path
+
+ def is_plugin( self, plugin_path ):
+ """
+ Determines whether the given filesystem path contains a plugin.
+
+ In this base class, all sub-directories are considered plugins.
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the
+ potential plugin
+ :rtype: bool
+ :returns: True if the path contains a plugin
+ """
+ if not os.path.isdir( plugin_path ):
+ return False
+ return True
+
+ def load_plugin( self, plugin_path ):
+ """
+ Create, load, and/or initialize the plugin and return it.
+
+ Plugin bunches are decorated with:
+ * name : the plugin name
+ * path : the plugin path
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the plugin
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ plugin = bunch.Bunch(
+ #TODO: need a better way to define plugin names
+ # pro: filesystem name ensures uniqueness
+ # con: rel. inflexible
+ name = os.path.split( plugin_path )[1],
+ path = plugin_path
+ )
+ return plugin
+
+
+# ============================================================================= plugin managers using hooks
+class HookPluginManager( PluginManager ):
+ """
+ A hook plugin is a directory containing python modules or packages that:
+ * allow creating, including, and running custom code at specific 'hook'
+ points/events
+ * are not tracked in the Galaxy repository and allow adding custom code
+ to a Galaxy installation
+
+ A HookPluginManager imports the plugin code needed and calls the plugin's
+ hook functions at the specified time.
+ """
+ #: the python file that will be imported - hook functions should be contained here
+ loading_point_filename = 'plugin.py'
+ hook_fn_prefix = 'hook_'
+
+ def is_plugin( self, plugin_path ):
+ """
+ Determines whether the given filesystem path contains a hookable plugin.
+
+ All sub-directories that contain ``loading_point_filename`` are considered
+ plugins.
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the
+ potential plugin
+ :rtype: bool
+ :returns: True if the path contains a plugin
+ """
+ if not super( HookPluginManager, self ).is_plugin( plugin_path ):
+ return False
+ #TODO: possibly switch to <plugin.name>.py or __init__.py
+ if self.loading_point_filename not in os.listdir( plugin_path ):
+ return False
+ return True
+
+ def load_plugin( self, plugin_path ):
+ """
+ Import the plugin ``loading_point_filename`` and attach to the plugin bunch.
+
+ Plugin bunches are decorated with:
+ * name : the plugin name
+ * path : the plugin path
+ * module : the plugin code
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the plugin
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ plugin = super( HookPluginManager, self ).load_plugin( plugin_path )
+
+ loading_point_name = self.loading_point_filename[:-3]
+ plugin[ 'module' ] = self.import_plugin_module( loading_point_name, plugin )
+ return plugin
+
+ def import_plugin_module( self, loading_point_name, plugin, import_as=None ):
+ """
+ Import the plugin code and cache the module in the plugin object.
+
+ :type loading_point_name: string
+ :param loading_point_name: name of the python file to import (w/o extension)
+ :type plugin: ``util.bunch.Bunch``
+ :param plugin: the plugin whose module is being imported
+ :type import_as: string
+ :param import_as: namespace to use for imported module
+ This will be prepended with the ``__name__`` of this file.
+ Defaults to ``plugin.name``
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ # add this name to import_as (w/ default to plugin.name) to prevent namespace pollution in sys.modules
+ import_as = '%s.%s' %( __name__, ( import_as or plugin.name ) )
+ module_file, pathname, description = imp.find_module( loading_point_name, [ plugin.path ] )
+ try:
+ #TODO: hate this hack but only way to get package imports inside the plugin to work?
+ sys.path.append( plugin.path )
+ # sys.modules will now have import_as in its list
+ module = imp.load_module( import_as, module_file, pathname, description )
+ finally:
+ module_file.close()
+ if plugin.path in sys.path:
+ sys.path.remove( plugin.path )
+ return module
+
+ def run_hook( self, hook_name, *args, **kwargs ):
+ """
+ Search all plugins for a function named ``hook_fn_prefix`` + ``hook_name``
+ and run it passing in args and kwargs.
+
+ Return values from each hook are returned in a dictionary keyed with the
+ plugin names.
+
+ :type hook_name: string
+ :param hook_name: name (suffix) of the hook to run
+ :rtype: dictionary
+ :returns: a dictionary where keys are plugin names and values are the
+ return values from the hooks
+ """
+ #TODO: is hook prefix necessary?
+ #TODO: could be made more efficient if cached by hook_name in the manager on load_plugin
+ # (low maint. overhead since no dynamic loading/unloading of plugins)
+ hook_fn_name = ''.join([ self.hook_fn_prefix, hook_name ])
+ returned = {}
+ for plugin_name, plugin in self.plugins.items():
+ hook_fn = getattr( plugin.module, hook_fn_name, None )
+
+ if hook_fn and hasattr( hook_fn, '__call__' ):
+ try:
+ #log.debug( 'calling %s from %s(%s)', hook_fn.func_name, plugin.name, plugin.module )
+ fn_returned = hook_fn( *args, **kwargs )
+ returned[ plugin.name ] = fn_returned
+ except Exception, exc:
+ # fail gracefully and continue with other plugins
+ log.exception( 'Hook function "%s" failed for plugin "%s"', hook_name, plugin.name )
+
+ # not sure of utility of this - seems better to be fire-and-forget pub-sub
+ return returned
+
+ def filter_hook( self, hook_name, hook_arg, *args, **kwargs ):
+ """
+ Search all plugins for a function named ``hook_fn_prefix`` + ``hook_name``
+ and run the first with ``hook_arg`` and every function after with the
+ return value of the previous.
+
+ ..note::
+ This makes plugin load order very important.
+
+ :type hook_name: string
+ :param hook_name: name (suffix) of the hook to run
+ :type hook_arg: any
+ :param hook_arg: the arg to be passed between hook functions
+ :rtype: any
+ :returns: the modified hook_arg
+ """
+ hook_fn_name = ''.join([ self.hook_fn_prefix, hook_name ])
+ for plugin_name, plugin in self.plugins.items():
+ hook_fn = getattr( plugin.module, hook_fn_name, None )
+
+ if hook_fn and hasattr( hook_fn, '__call__' ):
+ try:
+ hook_arg = hook_fn( hook_arg, *args, **kwargs )
+
+ except Exception, exc:
+ # fail gracefully and continue with other plugins
+ log.exception( 'Filter hook function "%s" failed for plugin "%s"', hook_name, plugin.name )
+
+ # may have been altered by hook fns, return
+ return hook_arg
+
+
+# ============================================================================= exceptions
+class PluginManagerStaticException( PluginManagerException ):
+ """Exception for plugin framework static directory set up errors.
+ """
+ pass
+class PluginManagerTemplateException( PluginManagerException ):
+ """Exception for plugin framework template directory
+ and template rendering errors.
+ """
+ pass
+
+
+# ============================================================================= base
+class PageServingPluginManager( PluginManager ):
+ """
+ Page serving plugins are files/directories that:
+ * are not tracked in the Galaxy repository and allow adding custom code
+ to a Galaxy installation
+ * serve static files (css, js, images, etc.),
+ * render templates
+
+ A PageServingPluginManager sets up all the above components.
+ """
+ #TODO: I'm unclear of the utility of this class - it prob. will only have one subclass (vis reg). Fold into?
+
+ #: does the class need static files served?
+ serves_static = True
+ #: does the class need template files served?
+ serves_templates = True
+ #: default number of templates to search for plugin template lookup
+ DEFAULT_TEMPLATE_COLLECTION_SIZE = 10
+ #: default encoding of plugin templates
+ DEFAULT_TEMPLATE_ENCODING = 'utf-8'
+
+ def __init__( self, app, base_url, template_cache_dir=None, **kwargs ):
+ """
+ Set up the manager and load all plugins.
+
+ :type app: UniverseApplication
+ :param app: the application (and its configuration) using this manager
+ :type base_url: string
+ :param base_url: url to prefix all plugin urls with
+ :type template_cache_dir: string
+ :param template_cache_dir: filesystem path to the directory where cached
+ templates are kept
+ """
+ self.base_url = base_url
+ self.template_cache_dir = template_cache_dir
+
+ super( PageServingPluginManager, self ).__init__( app, **kwargs )
+
+ def is_plugin( self, plugin_path ):
+ """
+ Determines whether the given filesystem path contains a plugin.
+
+ If the manager ``serves_templates`` and a sub-directory contains another
+ sub-directory named 'templates' it's considered valid.
+ If the manager ``serves_static`` and a sub-directory contains another
+ sub-directory named 'static' it's considered valid.
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the
+ potential plugin
+ :rtype: bool
+ :returns: True if the path contains a plugin
+ """
+ if not super( PageServingPluginManager, self ).is_plugin( plugin_path ):
+ return False
+ # reject only if the plugin has neither a 'templates' nor a 'static' dir
+ listdir = os.listdir( plugin_path )
+ if( ( 'templates' not in listdir )
+ and ( 'static' not in listdir ) ):
+ return False
+ return True
+
+ def load_plugin( self, plugin_path ):
+ """
+ Create the plugin and decorate with static and/or template paths and urls.
+
+ Plugin bunches are decorated with:
+ * name : the plugin name
+ * path : the plugin path
+ * base_url : a url to the plugin
+
+ :type plugin_path: string
+ :param plugin_path: relative or absolute filesystem path to the plugin
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ plugin = super( PageServingPluginManager, self ).load_plugin( plugin_path )
+ #TODO: urlencode?
+ plugin[ 'base_url' ] = '/'.join([ self.base_url, plugin.name ])
+ plugin = self._set_up_static_plugin( plugin )
+ plugin = self._set_up_template_plugin( plugin )
+
+ return plugin
+
+ def _set_up_static_plugin( self, plugin ):
+ """
+ Decorate the plugin with paths and urls needed to serve static content.
+
+ Plugin bunches are decorated with:
+ * serves_static : whether this plugin will serve static content
+
+ If the plugin path contains a 'static' sub-dir, the following are added:
+ * static_path : the filesystem path to the static content
+ * static_url : the url to use when serving static content
+
+ :type plugin: ``util.bunch.Bunch``
+ :param plugin: the plugin to decorate
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ plugin[ 'serves_static' ] = False
+ static_path = os.path.join( plugin.path, 'static' )
+ if self.serves_static and os.path.isdir( static_path ):
+ plugin.serves_static = True
+ plugin[ 'static_path' ] = static_path
+ plugin[ 'static_url' ] = '/'.join([ plugin.base_url, 'static' ])
+ return plugin
+
+ def _set_up_template_plugin( self, plugin ):
+ """
+ Decorate the plugin with paths needed to fill templates.
+
+ Plugin bunches are decorated with:
+ * serves_templates : whether this plugin will use templates
+
+ If the plugin path contains a 'templates' sub-dir, the following are added:
+ * template_path : the filesystem path to the template sub-dir
+ * template_lookup : the (currently Mako) TemplateLookup used to search
+ for templates
+
+ :type plugin: ``util.bunch.Bunch``
+ :param plugin: the plugin to decorate
+ :rtype: ``util.bunch.Bunch``
+ :returns: the loaded plugin object
+ """
+ plugin[ 'serves_templates' ] = False
+ template_path = os.path.join( plugin.path, 'templates' )
+ if self.serves_templates and os.path.isdir( template_path ):
+ plugin.serves_templates = True
+ plugin[ 'template_path' ] = template_path
+ plugin[ 'template_lookup' ] = self.build_plugin_template_lookup( plugin )
+ return plugin
+
# ------------------------------------------------------------------------- serving static files
def get_static_urls_and_paths( self ):
"""
@@ -160,81 +495,63 @@
same files.
Meant to be passed to a Static url map.
+
+ :rtype: list of 2-tuples
+ :returns: all urls and paths for each plugin serving static content
"""
- url_and_paths = []
# called during the static middleware creation (buildapp.py, wrap_in_static)
-
- # NOTE: this only searches for static dirs two levels deep (i.e. <plugin_directory>/<plugin-name>/static)
- for plugin_path in self.get_plugin_directories():
- # that path is a plugin, search for subdirs named static in THAT dir
- plugin_static_path = os.path.join( plugin_path, 'static' )
- if not os.path.isdir( plugin_static_path ):
- continue
-
- # build a url for that static subdir and create a Static urlmap entry for it
- plugin_name = os.path.splitext( os.path.basename( plugin_path ) )[0]
- plugin_url = self.name + '/' + plugin_name + '/static'
- url_and_paths.append( ( plugin_url, plugin_static_path ) )
-
- return url_and_paths
+ urls_and_paths = []
+ for plugin in self.plugins.values():
+ if plugin.serves_static:
+ urls_and_paths.append( ( plugin.static_url, plugin.static_path ) )
+ return urls_and_paths
# ------------------------------------------------------------------------- templates
- def set_up_templates( self, template_cache_dir ):
+ def build_plugin_template_lookup( self, plugin ):
"""
- Add a ``template_lookup`` attribute to the framework that can be passed
- to the mako renderer to find templates.
+ Builds the object that searches for templates (cached or not) when rendering.
+
+ :type plugin: ``util.bunch.Bunch``
+ :param plugin: the plugin containing the templates
+ :rtype: ``Mako.lookup.TemplateLookup``
+ :returns: template lookup for this plugin
"""
- if not template_cache_dir:
- raise PluginFrameworkTemplateException( 'Plugins that serve templates require a template_cache_dir' )
- self.template_lookup = self._create_mako_template_lookup( template_cache_dir, self._get_template_paths() )
- return self.template_lookup
+ if not plugin.serves_templates:
+ return None
+ template_lookup = self._create_mako_template_lookup( self.template_cache_dir, plugin.template_path )
+ return template_lookup
- def _get_template_paths( self ):
- """
- Get the paths that will be searched for templates.
- """
- return self.plugin_directories
-
- def _create_mako_template_lookup( self, cache_dir, paths, collection_size=500, output_encoding='utf-8' ):
+ def _create_mako_template_lookup( self, cache_dir, paths,
+ collection_size=DEFAULT_TEMPLATE_COLLECTION_SIZE, output_encoding=DEFAULT_TEMPLATE_ENCODING ):
"""
Create a ``TemplateLookup`` with defaults.
+
+ :rtype: ``Mako.lookup.TemplateLookup``
+ :returns: a TemplateLookup configured with the given defaults
"""
+ #TODO: possible to add galaxy/templates into the lookup here?
return mako.lookup.TemplateLookup(
directories = paths,
module_directory = cache_dir,
collection_size = collection_size,
output_encoding = output_encoding )
- #TODO: do we want to remove trans and app from the plugin template context?
- def fill_template( self, trans, template_filename, **kwargs ):
+ def fill_template( self, trans, plugin, template_filename, **kwargs ):
"""
- Pass control over to trans and render the ``template_filename``.
+ Pass control over to trans and render ``template_filename``.
+
+ :type trans: ``galaxy.web.framework.GalaxyWebTransaction``
+ :param trans: transaction doing the rendering
+ :type plugin: ``util.bunch.Bunch``
+ :param plugin: the plugin containing the template to render
+ :type template_filename: string
+ :param template_filename: the path of the template to render relative to
+ ``plugin.template_path``
+ :returns: rendered template
"""
# defined here to be overridden
- return trans.fill_template( template_filename, template_lookup=self.template_lookup, **kwargs )
+ return trans.fill_template( template_filename, template_lookup=plugin.template_lookup, **kwargs )
- def fill_template_with_plugin_imports( self, trans, template_filename, **kwargs ):
- """
- Returns a rendered plugin template but allows importing modules from inside
- the plugin directory within the template.
-
- ..example:: I.e. given this layout for a plugin:
- bler/
- template/
- bler.mako
- static/
- conifg/
- my_script.py
- this version of `fill_template` allows `bler.mako` to call `import my_script`.
- """
- try:
- plugin_path = os.path.dirname( os.path.dirname( template_filename ) )
- sys.path.append( plugin_path )
- filled_template = self.fill_template( trans, template_filename, **kwargs )
-
- finally:
- sys.path.remove( plugin_path )
-
- return filled_template
-
- #TODO: could add plugin template helpers here
+ #TODO: add fill_template fn that is able to load extra libraries beforehand (and remove after)
+ #TODO: add template helpers specific to the plugins
+ #TODO: some sort of url_for for these plugins
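
The per-plugin lookup built above is a standard Mako ``TemplateLookup`` over the plugin's ``templates`` directory, with compiled templates cached under ``module_directory``. A small end-to-end sketch (requires the mako package; paths are temporary throwaways):

    import os, tempfile
    from mako.lookup import TemplateLookup

    plugin_templates = tempfile.mkdtemp()
    with open(os.path.join(plugin_templates, 'hello.mako'), 'w') as f:
        f.write('Hello, ${name}!')

    lookup = TemplateLookup(directories=[plugin_templates],
                            module_directory=tempfile.mkdtemp(),  # compiled cache
                            collection_size=10,
                            output_encoding='utf-8')
    template = lookup.get_template('hello.mako')
    assert template.render(name='plugin') == b'Hello, plugin!'
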
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -198,7 +198,10 @@
if page_num == 0:
# Show all rows in page.
total_num_rows = query.count()
+ # persistent page='all'
page_num = 1
+ #page_num = 'all'
+ #extra_url_args['page'] = page_num
num_pages = 1
else:
# Show a limited number of rows. Before modifying query, get the total number of rows that query
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -235,8 +235,15 @@
# get file destination
file_destination = dataset.get_file_name()
+ # check if the directory exists
+ dn = os.path.dirname(file_destination)
+ if not os.path.exists(dn):
+ os.makedirs(dn)
+
+ # get file and directory names
+ fn = os.path.basename(content.filename)
+
# save file locally
- fn = os.path.basename(content.filename)
open(file_destination, 'wb').write(content.file.read())
# log
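
The fix above creates the destination directory before writing, since ``open(..., 'wb')`` fails if any parent directory is missing. A sketch of the same pattern, additionally using ``with`` so the file handle is closed (the bare ``open(...).write(...)`` above leaves closing to the garbage collector):

    import os

    def save_upload(file_destination, content_stream):
        dn = os.path.dirname(file_destination)
        if not os.path.exists(dn):
            os.makedirs(dn)  # makedirs raises if the path already exists
        with open(file_destination, 'wb') as out:
            out.write(content_stream.read())
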
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/api/tools.py
--- a/lib/galaxy/webapps/galaxy/api/tools.py
+++ b/lib/galaxy/webapps/galaxy/api/tools.py
@@ -32,7 +32,7 @@
# Create return value.
try:
- return self.app.toolbox.to_dict( trans, in_panel=in_panel, trackster=trackster )
+ return self.app.toolbox.to_dict( trans, in_panel=in_panel, trackster=trackster)
except Exception, exc:
log.error( 'could not convert toolbox to dictionary: %s', str( exc ), exc_info=True )
trans.response.status = 500
@@ -86,6 +86,17 @@
for k, v in payload.iteritems():
if k.startswith("files_"):
inputs[k] = v
+
+ #for inputs that are coming from the Library, copy them into the history
+ input_patch = {}
+ for k, v in inputs.iteritems():
+ if isinstance(v, dict) and v.get('src', '') == 'ldda' and 'id' in v:
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( trans.security.decode_id(v['id']) )
+ if trans.user_is_admin() or trans.app.security_agent.can_access_dataset( trans.get_current_user_roles(), ldda.dataset ):
+ input_patch[k] = ldda.to_history_dataset_association(target_history, add_to_history=True)
+
+ for k, v in input_patch.iteritems():
+ inputs[k] = v
# HACK: add run button so that tool.handle_input will run tool.
inputs['runtool_btn'] = 'Execute'
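
For the library-dataset branch above, an API payload might reference a library dataset by encoded id with ``src`` set to ``'ldda'``; the controller then copies it into the target history before the tool runs. An illustrative payload (tool id, input name, and encoded ids are all hypothetical):

    payload = {
        'tool_id': 'cat1',
        'history_id': 'f2db41e1fa331b3e',               # encoded history id
        'inputs': {
            'input1': {'src': 'ldda', 'id': '33b43b4e7093c91f'},
        },
    }
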
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -193,7 +193,8 @@
if kwargs.get( 'middleware', True ):
webapp = wrap_in_middleware( webapp, global_conf, **kwargs )
if asbool( kwargs.get( 'static_enabled', True ) ):
- webapp = wrap_in_static( webapp, global_conf, plugin_frameworks=app.config.plugin_frameworks, **kwargs )
+ webapp = wrap_in_static( webapp, global_conf, plugin_frameworks=[ app.visualizations_registry ], **kwargs )
+ #webapp = wrap_in_static( webapp, global_conf, plugin_frameworks=None, **kwargs )
if asbool(kwargs.get('pack_scripts', False)):
pack_scripts()
# Close any pooled database connections before forking
@@ -362,12 +363,13 @@
# wrap any static dirs for plugins
plugin_frameworks = plugin_frameworks or []
- for static_serving_framework in ( framework for framework in plugin_frameworks if framework.serves_static ):
- # invert control to each plugin for finding their own static dirs
- for plugin_url, plugin_static_path in static_serving_framework.get_static_urls_and_paths():
- plugin_url = '/plugins/' + plugin_url
- urlmap[( plugin_url )] = Static( plugin_static_path, cache_time )
- log.debug( 'added url, path to static middleware: %s, %s', plugin_url, plugin_static_path )
+ for framework in plugin_frameworks:
+ if framework and framework.serves_static:
+ # invert control to each plugin for finding their own static dirs
+ for plugin_url, plugin_static_path in framework.get_static_urls_and_paths():
+ plugin_url = '/plugins/' + plugin_url
+ urlmap[( plugin_url )] = Static( plugin_static_path, cache_time )
+ log.debug( 'added url, path to static middleware: %s, %s', plugin_url, plugin_static_path )
# URL mapper becomes the root webapp
return urlmap
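
The loop above mounts each static-serving plugin framework under ``/plugins/``. A self-contained sketch of the resulting url map (the fake registry stands in for the visualizations registry):

    def build_plugin_urlmap(frameworks):
        urlmap = {}
        for framework in frameworks:
            if framework and framework.serves_static:
                for plugin_url, static_path in framework.get_static_urls_and_paths():
                    urlmap['/plugins/' + plugin_url] = static_path
        return urlmap

    class FakeRegistry(object):
        serves_static = True
        def get_static_urls_and_paths(self):
            return [('visualizations/scatterplot/static',
                     '/srv/plugins/scatterplot/static')]

    assert build_plugin_urlmap([FakeRegistry(), None]) == \
        {'/plugins/visualizations/scatterplot/static': '/srv/plugins/scatterplot/static'}
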
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -812,7 +812,7 @@
status = kwd.get( 'status', 'done' )
shed_tool_conf = kwd.get( 'shed_tool_conf', None )
tool_shed_url = kwd[ 'tool_shed_url' ]
- # Handle repository dependencies.
+ # Handle repository dependencies, which do not include those that are required only for compiling a dependent repository's tool dependencies.
has_repository_dependencies = util.string_as_bool( kwd.get( 'has_repository_dependencies', False ) )
install_repository_dependencies = kwd.get( 'install_repository_dependencies', '' )
# Every repository will be installed into the same tool panel section or all will be installed outside of any sections.
@@ -1061,7 +1061,7 @@
repository_clone_url,
metadata,
trans.model.ToolShedRepository.installation_status.NEW,
- tool_shed_repository.installed_changeset_revision,
+ tool_shed_repository.changeset_revision,
tool_shed_repository.owner,
tool_shed_repository.dist_to_shed )
ctx_rev = suc.get_ctx_rev( trans.app,
@@ -1320,7 +1320,6 @@
missing_tool_dependencies = dependencies_for_repository_dict.get( 'missing_tool_dependencies', None )
repository_name = dependencies_for_repository_dict.get( 'name', None )
repository_owner = dependencies_for_repository_dict.get( 'repository_owner', None )
-
if installed_repository_dependencies or missing_repository_dependencies:
has_repository_dependencies = True
else:
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/controllers/history.py
--- a/lib/galaxy/webapps/galaxy/controllers/history.py
+++ b/lib/galaxy/webapps/galaxy/controllers/history.py
@@ -309,7 +309,6 @@
# If deleting the current history, make a new current.
if history == trans.get_history():
deleted_current = True
- trans.get_or_create_default_history()
trans.log_event( "History (%s) marked as deleted" % history.name )
n_deleted += 1
if purge and trans.app.config.allow_user_dataset_purge:
@@ -339,6 +338,8 @@
part += " but the datasets were not removed from disk because that feature is not enabled in this Galaxy instance"
message_parts.append( "%s. " % part )
if deleted_current:
+ #note: this needs to come after commits above or will use an empty history that was deleted above
+ trans.get_or_create_default_history()
message_parts.append( "Your active history was deleted, a new empty history is now active. " )
status = INFO
return ( status, " ".join( message_parts ) )
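
The reordering above matters because the replacement history must be created only after the deletions are flushed; otherwise the freshly created "current" history can itself be one of the just-deleted rows. A schematic sketch of that ordering (all callables are hypothetical stand-ins):

    def delete_histories(histories, current, mark_deleted, flush, new_default):
        deleted_current = False
        for history in histories:
            if history is current:
                deleted_current = True
            mark_deleted(history)
        flush()            # commit the deletions first...
        if deleted_current:
            new_default()  # ...then create the fresh active history
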
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/galaxy/controllers/visualization.py
--- a/lib/galaxy/webapps/galaxy/controllers/visualization.py
+++ b/lib/galaxy/webapps/galaxy/controllers/visualization.py
@@ -698,10 +698,10 @@
registry = trans.app.visualizations_registry
if not registry:
raise HTTPNotFound( 'No visualization registry (possibly disabled in universe_wsgi.ini)' )
- if visualization_name not in registry.listings:
+ if visualization_name not in registry.plugins:
raise HTTPNotFound( 'Unknown or invalid visualization: ' + visualization_name )
# or redirect to list?
- registry_listing = registry.listings[ visualization_name ]
+ plugin = registry.plugins[ visualization_name ]
returned = None
try:
@@ -711,8 +711,8 @@
resources = registry.query_dict_to_resources( trans, self, visualization_name, kwargs )
# look up template and render
- template_path = registry_listing[ 'template' ]
- returned = registry.fill_template( trans, template_path,
+ template_path = plugin.config[ 'template' ]
+ returned = registry.fill_template( trans, plugin, template_path,
visualization_name=visualization_name, query_args=kwargs,
embedded=embedded, shared_vars={}, **resources )
#NOTE: passing *unparsed* kwargs as query_args
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/tool_shed/api/repositories.py
--- a/lib/galaxy/webapps/tool_shed/api/repositories.py
+++ b/lib/galaxy/webapps/tool_shed/api/repositories.py
@@ -73,6 +73,7 @@
"changeset_revision": "3a08cc21466f",
"downloadable": true,
"has_repository_dependencies": false,
+ "has_repository_dependencies_only_if_compiling_contained_td": false,
"id": "f9cad7b01a472135",
"includes_datatypes": false,
"includes_tool_dependencies": false,
@@ -125,7 +126,8 @@
action='show',
id=encoded_repository_metadata_id )
# Get the repo_info_dict for installing the repository.
- repo_info_dict, includes_tools, includes_tool_dependencies, includes_tools_for_display_in_tool_panel, has_repository_dependencies = \
+ repo_info_dict, includes_tools, includes_tool_dependencies, includes_tools_for_display_in_tool_panel, \
+ has_repository_dependencies, has_repository_dependencies_only_if_compiling_contained_td = \
repository_util.get_repo_info_dict( trans, encoded_repository_id, changeset_revision )
return repository_dict, repository_metadata_dict, repo_info_dict
else:
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/tool_shed/controllers/repository.py
--- a/lib/galaxy/webapps/tool_shed/controllers/repository.py
+++ b/lib/galaxy/webapps/tool_shed/controllers/repository.py
@@ -1343,32 +1343,36 @@
def get_changeset_revision_and_ctx_rev( self, trans, **kwd ):
"""Handle a request from a local Galaxy instance to retrieve the changeset revision hash to which an installed repository can be updated."""
def has_galaxy_utilities( repository_metadata ):
- includes_data_managers = False
- includes_datatypes = False
- includes_tools = False
- includes_tools_for_display_in_tool_panel = False
- has_repository_dependencies = False
- includes_tool_dependencies = False
- includes_workflows = False
+ has_galaxy_utilities_dict = dict( includes_data_managers=False,
+ includes_datatypes=False,
+ includes_tools=False,
+ includes_tools_for_display_in_tool_panel=False,
+ has_repository_dependencies=False,
+ has_repository_dependencies_only_if_compiling_contained_td=False,
+ includes_tool_dependencies=False,
+ includes_workflows=False )
if repository_metadata:
includes_tools_for_display_in_tool_panel = repository_metadata.includes_tools_for_display_in_tool_panel
metadata = repository_metadata.metadata
if metadata:
if 'data_manager' in metadata:
- includes_data_managers = True
+ has_galaxy_utilities_dict[ 'includes_data_managers' ] = True
if 'datatypes' in metadata:
- includes_datatypes = True
+ has_galaxy_utilities_dict[ 'includes_datatypes' ] = True
if 'tools' in metadata:
- includes_tools = True
+ has_galaxy_utilities_dict[ 'includes_tools' ] = True
if 'tool_dependencies' in metadata:
- includes_tool_dependencies = True
- if 'repository_dependencies' in metadata:
- has_repository_dependencies = True
+ has_galaxy_utilities_dict[ 'includes_tool_dependencies' ] = True
+ repository_dependencies_dict = metadata.get( 'repository_dependencies', {} )
+ repository_dependencies = repository_dependencies_dict.get( 'repository_dependencies', [] )
+ has_repository_dependencies, has_repository_dependencies_only_if_compiling_contained_td = \
+ suc.get_repository_dependency_types( repository_dependencies )
+ has_galaxy_utilities_dict[ 'has_repository_dependencies' ] = has_repository_dependencies
+ has_galaxy_utilities_dict[ 'has_repository_dependencies_only_if_compiling_contained_td' ] = \
+ has_repository_dependencies_only_if_compiling_contained_td
if 'workflows' in metadata:
- includes_workflows = True
- return includes_data_managers, includes_datatypes, includes_tools, includes_tools_for_display_in_tool_panel, includes_tool_dependencies, has_repository_dependencies, includes_workflows
- message = kwd.get( 'message', '' )
- status = kwd.get( 'status', 'done' )
+ has_galaxy_utilities_dict[ 'includes_workflows' ] = True
+ return has_galaxy_utilities_dict
name = kwd.get( 'name', None )
owner = kwd.get( 'owner', None )
changeset_revision = kwd.get( 'changeset_revision', None )
@@ -1376,8 +1380,15 @@
repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans,
trans.security.encode_id( repository.id ),
changeset_revision )
- includes_data_managers, includes_datatypes, includes_tools, includes_tools_for_display_in_tool_panel, includes_tool_dependencies, has_repository_dependencies, includes_workflows = \
- has_galaxy_utilities( repository_metadata )
+ has_galaxy_utilities_dict = has_galaxy_utilities( repository_metadata )
+ includes_data_managers = has_galaxy_utilities_dict[ 'includes_data_managers' ]
+ includes_datatypes = has_galaxy_utilities_dict[ 'includes_datatypes' ]
+ includes_tools = has_galaxy_utilities_dict[ 'includes_tools' ]
+ includes_tools_for_display_in_tool_panel = has_galaxy_utilities_dict[ 'includes_tools_for_display_in_tool_panel' ]
+ includes_tool_dependencies = has_galaxy_utilities_dict[ 'includes_tool_dependencies' ]
+ has_repository_dependencies = has_galaxy_utilities_dict[ 'has_repository_dependencies' ]
+ has_repository_dependencies_only_if_compiling_contained_td = has_galaxy_utilities_dict[ 'has_repository_dependencies_only_if_compiling_contained_td' ]
+ includes_workflows = has_galaxy_utilities_dict[ 'includes_workflows' ]
repo_dir = repository.repo_path( trans.app )
repo = hg.repository( suc.get_configured_ui(), repo_dir )
# Default to the received changeset revision and ctx_rev.
@@ -1392,6 +1403,7 @@
includes_tools_for_display_in_tool_panel=includes_tools_for_display_in_tool_panel,
includes_tool_dependencies=includes_tool_dependencies,
has_repository_dependencies=has_repository_dependencies,
+ has_repository_dependencies_only_if_compiling_contained_td=has_repository_dependencies_only_if_compiling_contained_td,
includes_workflows=includes_workflows )
if changeset_revision == repository.tip( trans.app ):
# If changeset_revision is the repository tip, there are no additional updates.
@@ -1407,6 +1419,7 @@
for changeset in repo.changelog:
includes_tools = False
has_repository_dependencies = False
+ has_repository_dependencies_only_if_compiling_contained_td = False
changeset_hash = str( repo.changectx( changeset ) )
ctx = suc.get_changectx_for_changeset( repo, changeset_hash )
if update_to_changeset_hash:
@@ -1414,8 +1427,15 @@
trans.security.encode_id( repository.id ),
changeset_hash )
if update_to_repository_metadata:
- includes_data_managers, includes_datatypes, includes_tools, includes_tools_for_display_in_tool_panel, includes_tool_dependencies, has_repository_dependencies, includes_workflows = \
- has_galaxy_utilities( update_to_repository_metadata )
+ has_galaxy_utilities_dict = has_galaxy_utilities( update_to_repository_metadata )
+ includes_data_managers = has_galaxy_utilities_dict[ 'includes_data_managers' ]
+ includes_datatypes = has_galaxy_utilities_dict[ 'includes_datatypes' ]
+ includes_tools = has_galaxy_utilities_dict[ 'includes_tools' ]
+ includes_tools_for_display_in_tool_panel = has_galaxy_utilities_dict[ 'includes_tools_for_display_in_tool_panel' ]
+ includes_tool_dependencies = has_galaxy_utilities_dict[ 'includes_tool_dependencies' ]
+ has_repository_dependencies = has_galaxy_utilities_dict[ 'has_repository_dependencies' ]
+ has_repository_dependencies_only_if_compiling_contained_td = has_galaxy_utilities_dict[ 'has_repository_dependencies_only_if_compiling_contained_td' ]
+ includes_workflows = has_galaxy_utilities_dict[ 'includes_workflows' ]
# We found a RepositoryMetadata record.
if changeset_hash == repository.tip( trans.app ):
# The current ctx is the repository tip, so use it.
@@ -1435,6 +1455,7 @@
update_dict[ 'includes_tool_dependencies' ] = includes_tool_dependencies
update_dict[ 'includes_workflows' ] = includes_workflows
update_dict[ 'has_repository_dependencies' ] = has_repository_dependencies
+ update_dict[ 'has_repository_dependencies_only_if_compiling_contained_td' ] = has_repository_dependencies_only_if_compiling_contained_td
update_dict[ 'changeset_revision' ] = str( latest_changeset_revision )
update_dict[ 'ctx_rev' ] = str( update_to_ctx.rev() )
return encoding_util.tool_shed_encode( update_dict )
@@ -1611,14 +1632,18 @@
includes_tools = False
includes_tools_for_display_in_tool_panel = False
has_repository_dependencies = False
+ has_repository_dependencies_only_if_compiling_contained_td = False
includes_tool_dependencies = False
repo_info_dicts = []
for tup in zip( util.listify( repository_ids ), util.listify( changeset_revisions ) ):
repository_id, changeset_revision = tup
- repo_info_dict, cur_includes_tools, cur_includes_tool_dependencies, cur_includes_tools_for_display_in_tool_panel, cur_has_repository_dependencies = \
+ repo_info_dict, cur_includes_tools, cur_includes_tool_dependencies, cur_includes_tools_for_display_in_tool_panel, \
+ cur_has_repository_dependencies, cur_has_repository_dependencies_only_if_compiling_contained_td = \
repository_util.get_repo_info_dict( trans, repository_id, changeset_revision )
if cur_has_repository_dependencies and not has_repository_dependencies:
has_repository_dependencies = True
+ if cur_has_repository_dependencies_only_if_compiling_contained_td and not has_repository_dependencies_only_if_compiling_contained_td:
+ has_repository_dependencies_only_if_compiling_contained_td = True
if cur_includes_tools and not includes_tools:
includes_tools = True
if cur_includes_tool_dependencies and not includes_tool_dependencies:
@@ -1629,6 +1654,7 @@
return dict( includes_tools=includes_tools,
includes_tools_for_display_in_tool_panel=includes_tools_for_display_in_tool_panel,
has_repository_dependencies=has_repository_dependencies,
+ has_repository_dependencies_only_if_compiling_contained_td=has_repository_dependencies_only_if_compiling_contained_td,
includes_tool_dependencies=includes_tool_dependencies,
repo_info_dicts=repo_info_dicts )
@@ -1708,7 +1734,9 @@
tool_version_dicts = []
for changeset in repo.changelog:
current_changeset_revision = str( repo.changectx( changeset ) )
- repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans, trans.security.encode_id( repository.id ), current_changeset_revision )
+ repository_metadata = suc.get_repository_metadata_by_changeset_revision( trans,
+ trans.security.encode_id( repository.id ),
+ current_changeset_revision )
if repository_metadata and repository_metadata.tool_versions:
tool_version_dicts.append( repository_metadata.tool_versions )
if current_changeset_revision == changeset_revision:
@@ -1766,22 +1794,30 @@
includes_workflows = True
readme_files_dict = readme_util.build_readme_files_dict( metadata )
# See if the repo_info_dict was populated with repository_dependencies or tool_dependencies.
+ has_repository_dependencies = False
+ has_repository_dependencies_only_if_compiling_contained_td = False
+ includes_tool_dependencies = False
for name, repo_info_tuple in repo_info_dict.items():
- description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = \
- suc.get_repo_info_tuple_contents( repo_info_tuple )
- if repository_dependencies:
- has_repository_dependencies = True
- else:
- has_repository_dependencies = False
- if tool_dependencies:
- includes_tool_dependencies = True
- else:
- includes_tool_dependencies = False
+ if not has_repository_dependencies or not has_repository_dependencies_only_if_compiling_contained_td or not includes_tool_dependencies:
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ for rd_key, rd_tups in repository_dependencies.items():
+ if rd_key in [ 'root_key', 'description' ]:
+ continue
+ curr_has_repository_dependencies, curr_has_repository_dependencies_only_if_compiling_contained_td = \
+ suc.get_repository_dependency_types( rd_tups )
+ if curr_has_repository_dependencies and not has_repository_dependencies:
+ has_repository_dependencies = True
+ if curr_has_repository_dependencies_only_if_compiling_contained_td and not has_repository_dependencies_only_if_compiling_contained_td:
+ has_repository_dependencies_only_if_compiling_contained_td = True
+ if tool_dependencies and not includes_tool_dependencies:
+ includes_tool_dependencies = True
return dict( includes_data_managers=includes_data_managers,
includes_datatypes=includes_datatypes,
includes_tools=includes_tools,
includes_tools_for_display_in_tool_panel=includes_tools_for_display_in_tool_panel,
has_repository_dependencies=has_repository_dependencies,
+ has_repository_dependencies_only_if_compiling_contained_td=has_repository_dependencies_only_if_compiling_contained_td,
includes_tool_dependencies=includes_tool_dependencies,
includes_workflows=includes_workflows,
readme_files_dict=readme_files_dict,
@@ -2434,7 +2470,9 @@
try:
commands.remove( repo.ui, repo, selected_file, force=True )
except Exception, e:
- log.debug( "Error removing files using the mercurial API, so trying a different approach, the error was: %s" % str( e ))
+ log.debug( "Error removing the following file using the mercurial API:\n %s" % str( selected_file ) )
+ log.debug( "The error was: %s" % str( e ))
+ log.debug( "Attempting to remove the file using a different approach." )
relative_selected_file = selected_file.split( 'repo_%d' % repository.id )[1].lstrip( '/' )
repo.dirstate.remove( relative_selected_file )
repo.dirstate.write()
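For context on the fallback above: when commands.remove() raises, the code drops below Mercurial's command layer and marks the file removed directly in the dirstate, using a path relative to the repository root. A minimal standalone sketch of the same two-step pattern (repository path and file name are hypothetical; the mercurial imports are the same ones this module already uses):

    from mercurial import commands, hg, ui

    def remove_tracked_file(repo_dir, selected_file):
        repo = hg.repository(ui.ui(), repo_dir)
        try:
            # Preferred path: the high-level API updates the dirstate itself.
            commands.remove(repo.ui, repo, selected_file, force=True)
        except Exception:
            # Fallback: mark the file removed in the dirstate by its
            # repo-relative path, then persist the dirstate to disk.
            relative = selected_file.split(repo_dir)[-1].lstrip('/')
            repo.dirstate.remove(relative)
            repo.dirstate.write()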
diff -r 67be8479c89db565816a71edd41d69228b4e87ee -r 65158f203ac7b0bd540778b2771bde08557f832a lib/galaxy/webapps/tool_shed/model/__init__.py
--- a/lib/galaxy/webapps/tool_shed/model/__init__.py
+++ b/lib/galaxy/webapps/tool_shed/model/__init__.py
@@ -253,6 +253,7 @@
self.time_last_tested = time_last_tested
self.tool_test_results = tool_test_results
self.has_repository_dependencies = has_repository_dependencies
+ # We don't consider the special case has_repository_dependencies_only_if_compiling_contained_td here.
self.includes_datatypes = includes_datatypes
self.includes_tools = includes_tools
self.includes_tool_dependencies = includes_tool_dependencies
This diff is so big that we needed to truncate the remainder.
https://bitbucket.org/galaxy/galaxy-central/commits/acd70e8acbe6/
Changeset: acd70e8acbe6
User: BjoernGruening
Date: 2013-10-15 20:49:11
Summary: merge
Affected #: 585 files
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e config/disposable_email_blacklist.conf.sample
--- /dev/null
+++ b/config/disposable_email_blacklist.conf.sample
@@ -0,0 +1,9 @@
+If you want to disable registration for users who use disposable email addresses,
+rename this file to disposable_email_blacklist.conf and fill it with the disposable domains
+that you want blacklisted, each on its own line and without the '@' character, as shown below.
+Users registering with emails from these domains will get an error during registration.
+
+mailinator.com
+sogetthis.com
+spamgourmet.com
+trashmail.net
\ No newline at end of file
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e datatypes_conf.xml.sample
--- a/datatypes_conf.xml.sample
+++ b/datatypes_conf.xml.sample
@@ -245,10 +245,6 @@
<datatype extension="snpmatrix" type="galaxy.datatypes.genetics:SNPMatrix" display_in_upload="true"/>
<datatype extension="xls" type="galaxy.datatypes.tabular:Tabular"/>
<!-- End RGenetics Datatypes -->
- <!-- graph datatypes -->
- <datatype extension="xgmml" type="galaxy.datatypes.graph:Xgmml" display_in_upload="true"/>
- <datatype extension="sif" type="galaxy.datatypes.graph:Sif" display_in_upload="true"/>
- <datatype extension="rdf" type="galaxy.datatypes.graph:Rdf" display_in_upload="true"/>
</registration>
<sniffers>
<!--
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e dist-eggs.ini
--- a/dist-eggs.ini
+++ b/dist-eggs.ini
@@ -1,88 +1,54 @@
;
; Config for building eggs for distribution (via a site such as
-; eggs.g2.bx.psu.edu) Probably only useful to Galaxy developers at
-; Penn State. This file is used by scripts/dist-scramble.py
+; eggs.galaxyproject.org) Probably only useful to members of the Galaxy Team
+; building eggs for distribution. This file is used by
+; scripts/dist-scramble.py
;
-; More information: http://wiki.g2.bx.psu.edu/Admin/Config/Eggs
+; More information: http://wiki.galaxyproject.org/Admin/Config/Eggs
;
[hosts]
-py2.5-linux-i686-ucs2 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs2/bin/python2.5
-py2.5-linux-i686-ucs4 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs4/bin/python2.5
py2.6-linux-i686-ucs2 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs2/bin/python2.6
py2.6-linux-i686-ucs4 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs4/bin/python2.6
py2.7-linux-i686-ucs2 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs2/bin/python2.7
py2.7-linux-i686-ucs4 = stegmaier.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-i686-ucs4/bin/python2.7
-py2.5-linux-x86_64-ucs2 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs2/bin/python2.5
-py2.5-linux-x86_64-ucs4 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.5
py2.6-linux-x86_64-ucs2 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs2/bin/python2.6
py2.6-linux-x86_64-ucs4 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.6
py2.7-linux-x86_64-ucs2 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs2/bin/python2.7
py2.7-linux-x86_64-ucs4 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.7
-py2.5-macosx-10.3-fat-ucs2 = weyerbacher.bx.psu.edu /Library/Frameworks/Python.framework/Versions/2.5/bin/python2.5
py2.6-macosx-10.3-fat-ucs2 = weyerbacher.bx.psu.edu /Library/Frameworks/Python.framework/Versions/2.6/bin/python2.6
py2.7-macosx-10.3-fat-ucs2 = weyerbacher.bx.psu.edu /Library/Frameworks/Python.framework/Versions/2.7/bin/python2.7
py2.6-macosx-10.6-universal-ucs2 = lion.bx.psu.edu /usr/bin/python2.6
py2.7-macosx-10.6-intel-ucs2 = lion.bx.psu.edu /usr/local/bin/python2.7
-py2.5-solaris-2.10-i86pc_32-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_32-ucs2/bin/python2.5
-py2.6-solaris-2.10-i86pc_32-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_32-ucs2/bin/python2.6
-py2.7-solaris-2.10-i86pc_32-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_32-ucs2/bin/python2.7
-py2.5-solaris-2.10-i86pc_64-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_64-ucs2/bin/python2.5
-py2.6-solaris-2.10-i86pc_64-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_64-ucs2/bin/python2.6
-py2.7-solaris-2.10-i86pc_64-ucs2 = thumper.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-i86pc_64-ucs2/bin/python2.7
-py2.5-solaris-2.10-sun4u_32-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.8-sun4u_32-ucs2/bin/python2.5
-py2.6-solaris-2.10-sun4u_32-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.8-sun4u_32-ucs2/bin/python2.6
-py2.7-solaris-2.10-sun4u_32-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.8-sun4u_32-ucs2/bin/python2.7
-py2.5-solaris-2.10-sun4u_64-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-sun4u_64-ucs2/bin/python2.5
-py2.6-solaris-2.10-sun4u_64-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-sun4u_64-ucs2/bin/python2.6
-py2.7-solaris-2.10-sun4u_64-ucs2 = early.bx.psu.edu /afs/bx.psu.edu/project/pythons/solaris-2.10-sun4u_64-ucs2/bin/python2.7
-
+;
; these hosts are used to build eggs with no C extensions
-py2.5 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.5
py2.6 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.6
py2.7 = straub.bx.psu.edu /afs/bx.psu.edu/project/pythons/linux-x86_64-ucs4/bin/python2.7
[groups]
-py2.5-linux-i686 = py2.5-linux-i686-ucs2 py2.5-linux-i686-ucs4
-py2.5-linux-x86_64 = py2.5-linux-x86_64-ucs2 py2.5-linux-x86_64-ucs4
py2.6-linux-i686 = py2.6-linux-i686-ucs2 py2.6-linux-i686-ucs4
py2.6-linux-x86_64 = py2.6-linux-x86_64-ucs2 py2.6-linux-x86_64-ucs4
py2.7-linux-i686 = py2.7-linux-i686-ucs2 py2.7-linux-i686-ucs4
py2.7-linux-x86_64 = py2.7-linux-x86_64-ucs2 py2.7-linux-x86_64-ucs4
-py2.5-linux = py2.5-linux-i686 py2.5-linux-x86_64
py2.6-linux = py2.6-linux-i686 py2.6-linux-x86_64
py2.7-linux = py2.7-linux-i686 py2.7-linux-x86_64
-linux-i686 = py2.5-linux-i686 py2.6-linux-i686 py2.7-linux-i686
-linux-x86_64 = py2.5-linux-x86_64 py2.6-linux-x86_64 py2.7-linux-x86_64
+linux-i686 = py2.6-linux-i686 py2.7-linux-i686
+linux-x86_64 = py2.6-linux-x86_64 py2.7-linux-x86_64
linux = linux-i686 linux-x86_64
-py2.5-macosx = py2.5-macosx-10.3-fat-ucs2
py2.6-macosx = py2.6-macosx-10.3-fat-ucs2 py2.6-macosx-10.6-universal-ucs2
py2.7-macosx = py2.7-macosx-10.3-fat-ucs2 py2.7-macosx-10.6-intel-ucs2
-macosx = py2.5-macosx py2.6-macosx py2.7-macosx
-py2.5-solaris-i86pc = py2.5-solaris-2.10-i86pc_32-ucs2 py2.5-solaris-2.10-i86pc_64-ucs2
-py2.6-solaris-i86pc = py2.6-solaris-2.10-i86pc_32-ucs2 py2.6-solaris-2.10-i86pc_64-ucs2
-py2.7-solaris-i86pc = py2.7-solaris-2.10-i86pc_32-ucs2 py2.7-solaris-2.10-i86pc_64-ucs2
-py2.5-solaris-sun4u = py2.5-solaris-2.10-sun4u_32-ucs2 py2.5-solaris-2.10-sun4u_64-ucs2
-py2.6-solaris-sun4u = py2.6-solaris-2.10-sun4u_32-ucs2 py2.6-solaris-2.10-sun4u_64-ucs2
-py2.7-solaris-sun4u = py2.7-solaris-2.10-sun4u_32-ucs2 py2.7-solaris-2.10-sun4u_64-ucs2
-py2.5-solaris = py2.5-solaris-i86pc py2.5-solaris-sun4u
-py2.6-solaris = py2.6-solaris-i86pc py2.6-solaris-sun4u
-py2.7-solaris = py2.7-solaris-i86pc py2.7-solaris-sun4u
-solaris-i86pc = py2.5-solaris-i86pc py2.6-solaris-i86pc py2.7-solaris-i86pc
-solaris-sun4u = py2.5-solaris-sun4u py2.6-solaris-sun4u py2.7-solaris-sun4u
-solaris = solaris-i86pc solaris-sun4u
-py2.5-all = py2.5-linux py2.5-macosx py2.5-solaris
-py2.6-all = py2.6-linux py2.6-macosx py2.6-solaris
-py2.7-all = py2.7-linux py2.7-macosx py2.7-solaris
+macosx = py2.6-macosx py2.7-macosx
+py2.6-all = py2.6-linux py2.6-macosx
+py2.7-all = py2.7-linux py2.7-macosx
; the 'all' key is used internally by the build system to specify which hosts
; to build on when no hosts are specified on the dist-eggs.py command line.
-all = linux macosx solaris
+all = linux macosx
; the 'noplatform' key, likewise, is for which build hosts should be used when
; building pure python (noplatform) eggs.
-noplatform = py2.5 py2.6 py2.7
+noplatform = py2.6 py2.7
; don't build these eggs on these platforms:
[ignore]
-ctypes = py2.5-linux-i686-ucs2 py2.5-linux-i686-ucs4 py2.6-linux-i686-ucs2 py2.6-linux-i686-ucs4 py2.7-linux-i686-ucs2 py2.7-linux-i686-ucs4 py2.5-linux-x86_64-ucs2 py2.5-linux-x86_64-ucs4 py2.6-linux-x86_64-ucs2 py2.6-linux-x86_64-ucs4 py2.7-linux-x86_64-ucs2 py2.7-linux-x86_64-ucs4 py2.5-macosx-10.3-fat-ucs2 py2.6-macosx-10.3-fat-ucs2 py2.6-macosx-10.6-universal-ucs2 py2.7-macosx-10.3-fat-ucs2 py2.5-solaris-2.10-i86pc_32-ucs2 py2.6-solaris-2.10-i86pc_32-ucs2 py2.7-solaris-2.10-i86pc_32-ucs2 py2.5-solaris-2.10-i86pc_64-ucs2 py2.6-solaris-2.10-i86pc_64-ucs2 py2.7-solaris-2.10-i86pc_64-ucs2 py2.5-solaris-2.10-sun4u_32-ucs2 py2.6-solaris-2.10-sun4u_32-ucs2 py2.7-solaris-2.10-sun4u_32-ucs2 py2.5-solaris-2.10-sun4u_64-ucs2 py2.6-solaris-2.10-sun4u_64-ucs2 py2.7-solaris-2.10-sun4u_64-ucs2
+;ctypes =
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e eggs.ini
--- a/eggs.ini
+++ b/eggs.ini
@@ -21,7 +21,7 @@
PyRods = 3.2.4
numpy = 1.6.0
pbs_python = 4.3.5
-psycopg2 = 2.0.13
+psycopg2 = 2.5.1
pycrypto = 2.5
pysam = 0.4.2
pysqlite = 2.5.6
@@ -38,15 +38,16 @@
boto = 2.5.2
decorator = 3.1.2
docutils = 0.7
-drmaa = 0.4b3
+drmaa = 0.6
elementtree = 1.2.6_20050316
-Fabric = 1.4.2
+Fabric = 1.7.0
GeneTrack = 2.0.0_beta_1
lrucache = 0.2
Mako = 0.4.1
nose = 0.11.1
NoseHTML = 0.4.1
NoseTestDiff = 0.1
+paramiko = 1.11.1
Parsley = 1.1
Paste = 1.7.5.1
PasteDeploy = 1.5.0
@@ -71,7 +72,7 @@
; extra version information
[tags]
-psycopg2 = _8.4.2_static
+psycopg2 = _9.2.4_static
pysqlite = _3.6.17_static
MySQL_python = _5.1.41_static
bx_python = _7b95ff194725
@@ -82,7 +83,7 @@
; the wiki page above
[source]
MySQL_python = mysql-5.1.41
-psycopg2 = postgresql-8.4.2
+psycopg2 = postgresql-9.2.4
pysqlite = sqlite-amalgamation-3_6_17
[dependencies]
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/app.py
--- a/lib/galaxy/app.py
+++ b/lib/galaxy/app.py
@@ -172,6 +172,7 @@
self.job_stop_queue = self.job_manager.job_stop_queue
# Initialize the external service types
self.external_service_types = external_service_types.ExternalServiceTypesCollection( self.config.external_service_type_config_file, self.config.external_service_type_path, self )
+ self.model.engine.dispose()
def shutdown( self ):
self.job_manager.shutdown()
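The one-line addition above, self.model.engine.dispose(), closes every connection in the SQLAlchemy engine's pool once application startup finishes, so anything that runs afterwards opens a fresh database connection; this is a common precaution when processes may fork after startup, since children should not inherit the parent's open database sockets. A minimal sketch of the call (the connection URL is hypothetical):

    from sqlalchemy import create_engine

    engine = create_engine('postgresql://localhost/galaxy')
    conn = engine.connect()   # startup work checks out a pooled connection
    conn.close()              # returned to the pool, socket still open
    # dispose() closes all pooled connections; later use establishes new ones.
    engine.dispose()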
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/config.py
--- a/lib/galaxy/config.py
+++ b/lib/galaxy/config.py
@@ -63,7 +63,7 @@
elif 'tool_config_files' in kwargs:
tcf = kwargs[ 'tool_config_files' ]
else:
- tcf = 'tool_conf.xml'
+ tcf = 'tool_conf.xml,shed_tool_conf.xml'
self.tool_filters = listify( kwargs.get( "tool_filters", [] ) )
self.tool_label_filters = listify( kwargs.get( "tool_label_filters", [] ) )
self.tool_section_filters = listify( kwargs.get( "tool_section_filters", [] ) )
@@ -132,6 +132,23 @@
self.admin_users = kwargs.get( "admin_users", "" )
self.mailing_join_addr = kwargs.get('mailing_join_addr',"galaxy-announce-join(a)bx.psu.edu")
self.error_email_to = kwargs.get( 'error_email_to', None )
+ self.activation_email = kwargs.get( 'activation_email', None )
+ self.user_activation_on = string_as_bool( kwargs.get( 'user_activation_on', False ) )
+ self.activation_grace_period = kwargs.get( 'activation_grace_period', None )
+ self.inactivity_box_content = kwargs.get( 'inactivity_box_content', None )
+ self.terms_url = kwargs.get( 'terms_url', None )
+ self.instance_resource_url = kwargs.get( 'instance_resource_url', None )
+ self.registration_warning_message = kwargs.get( 'registration_warning_message', None )
+ # Get the disposable email domains blacklist file and its contents
+ self.blacklist_location = kwargs.get( 'blacklist_file', None )
+ self.blacklist_content = None
+ if self.blacklist_location is not None:
+ self.blacklist_file = resolve_path( kwargs.get( 'blacklist_file', None ), self.root )
+ try:
+ with open( self.blacklist_file ) as blacklist:
+ self.blacklist_content = [ line.rstrip() for line in blacklist.readlines() ]
+ except IOError:
+ print ( "CONFIGURATION ERROR: Can't open supplied blacklist file from path: " + str( self.blacklist_file ) )
self.smtp_server = kwargs.get( 'smtp_server', None )
self.smtp_username = kwargs.get( 'smtp_username', None )
self.smtp_password = kwargs.get( 'smtp_password', None )
@@ -175,7 +192,7 @@
self.message_box_content = kwargs.get( 'message_box_content', None )
self.message_box_class = kwargs.get( 'message_box_class', 'info' )
self.support_url = kwargs.get( 'support_url', 'http://wiki.g2.bx.psu.edu/Support' )
- self.wiki_url = kwargs.get( 'wiki_url', 'http://g2.trac.bx.psu.edu/' )
+ self.wiki_url = kwargs.get( 'wiki_url', 'http://wiki.galaxyproject.org/' )
self.blog_url = kwargs.get( 'blog_url', None )
self.screencasts_url = kwargs.get( 'screencasts_url', None )
self.library_import_dir = kwargs.get( 'library_import_dir', None )
@@ -227,6 +244,9 @@
self.os_is_secure = string_as_bool( kwargs.get( 'os_is_secure', True ) )
self.os_conn_path = kwargs.get( 'os_conn_path', '/' )
self.object_store_cache_size = float(kwargs.get( 'object_store_cache_size', -1 ))
+ self.object_store_config_file = kwargs.get( 'object_store_config_file', None )
+ if self.object_store_config_file is not None:
+ self.object_store_config_file = resolve_path( self.object_store_config_file, self.root )
self.distributed_object_store_config_file = kwargs.get( 'distributed_object_store_config_file', None )
if self.distributed_object_store_config_file is not None:
self.distributed_object_store_config_file = resolve_path( self.distributed_object_store_config_file, self.root )
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/bed_gff_or_vcf_to_bigwig_converter.xml
--- a/lib/galaxy/datatypes/converters/bed_gff_or_vcf_to_bigwig_converter.xml
+++ b/lib/galaxy/datatypes/converters/bed_gff_or_vcf_to_bigwig_converter.xml
@@ -9,7 +9,12 @@
grep -v '^#' $input | sort -k1,1 |
## Generate coverage bedgraph.
- bedtools genomecov -bg -split -i stdin -g $chromInfo
+ bedtools genomecov -bg -i stdin -g $chromInfo
+
+ ## Only use split option for bed and gff/gff3/gtf.
+ #if $input.ext in [ 'bed', 'gff', 'gff3', 'gtf' ]:
+ -split
+ #end if
## Streaming the bedgraph file to wigToBigWig is fast but very memory intensive; hence, this
## should only be used on systems with large RAM.
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/fastq_to_fqtoc.py
--- a/lib/galaxy/datatypes/converters/fastq_to_fqtoc.py
+++ b/lib/galaxy/datatypes/converters/fastq_to_fqtoc.py
@@ -28,7 +28,7 @@
lines_per_chunk = 4*sequences
chunk_begin = 0
- in_file = open(input_name)
+ in_file = open(input_fname)
out_file.write('{"sections" : [');
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/lped_to_fped_converter.py
--- a/lib/galaxy/datatypes/converters/lped_to_fped_converter.py
+++ b/lib/galaxy/datatypes/converters/lped_to_fped_converter.py
@@ -14,7 +14,7 @@
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
-<meta name="generator" content="Galaxy %s tool output - see http://g2.trac.bx.psu.edu/" />
+<meta name="generator" content="Galaxy %s tool output - see http://getgalaxy.org" /><title></title><link rel="stylesheet" href="/static/style/base.css" type="text/css" /></head>
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/lped_to_pbed_converter.py
--- a/lib/galaxy/datatypes/converters/lped_to_pbed_converter.py
+++ b/lib/galaxy/datatypes/converters/lped_to_pbed_converter.py
@@ -15,7 +15,7 @@
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
-<meta name="generator" content="Galaxy %s tool output - see http://g2.trac.bx.psu.edu/" />
+<meta name="generator" content="Galaxy %s tool output - see http://getgalaxy.org" /><title></title><link rel="stylesheet" href="/static/style/base.css" type="text/css" /></head>
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/pbed_ldreduced_converter.py
--- a/lib/galaxy/datatypes/converters/pbed_ldreduced_converter.py
+++ b/lib/galaxy/datatypes/converters/pbed_ldreduced_converter.py
@@ -13,7 +13,7 @@
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
-<meta name="generator" content="Galaxy %s tool output - see http://g2.trac.bx.psu.edu/" />
+<meta name="generator" content="Galaxy %s tool output - see http://getgalaxy.org" /><title></title><link rel="stylesheet" href="/static/style/base.css" type="text/css" /></head>
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/converters/pbed_to_lped_converter.py
--- a/lib/galaxy/datatypes/converters/pbed_to_lped_converter.py
+++ b/lib/galaxy/datatypes/converters/pbed_to_lped_converter.py
@@ -15,7 +15,7 @@
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en"><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
-<meta name="generator" content="Galaxy %s tool output - see http://g2.trac.bx.psu.edu/" />
+<meta name="generator" content="Galaxy %s tool output - see http://getgalaxy.org" /><title></title><link rel="stylesheet" href="/static/style/base.css" type="text/css" /></head>
@@ -66,7 +66,7 @@
f = file(outhtmlname,'w')
f.write(galhtmlprefix % prog)
flist = os.listdir(outfilepath)
- s = '## Rgenetics: http://rgenetics.org Galaxy Tools %s %s' % (prog,timenow()) # becomes info
+ s = '## Rgenetics: http://bitbucket.org/rgalaxy Galaxy Tools %s %s' % (prog,timenow()) # becomes info
print s
f.write('<div>%s\n<ol>' % (s))
for i, data in enumerate( flist ):
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/datatypes/tabular.py
--- a/lib/galaxy/datatypes/tabular.py
+++ b/lib/galaxy/datatypes/tabular.py
@@ -33,8 +33,6 @@
MetadataElement( name="column_types", default=[], desc="Column types", param=metadata.ColumnTypesParameter, readonly=True, visible=False, no_value=[] )
MetadataElement( name="column_names", default=[], desc="Column names", readonly=True, visible=False, optional=True, no_value=[] )
- def init_meta( self, dataset, copy_from=None ):
- data.Text.init_meta( self, dataset, copy_from=copy_from )
def set_meta( self, dataset, overwrite = True, skip = None, max_data_lines = 100000, max_guess_type_data_lines = None, **kwd ):
"""
Tries to determine the number of columns as well as those columns that
@@ -199,10 +197,13 @@
if not column_names and dataset.metadata.column_names:
column_names = dataset.metadata.column_names
- column_headers = [None] * dataset.metadata.columns
+ columns = dataset.metadata.columns
+ if columns is None:
+ columns = dataset.metadata.spec.columns.no_value
+ column_headers = [None] * columns
# fill in empty headers with data from column_names
- for i in range( min( dataset.metadata.columns, len( column_names ) ) ):
+ for i in range( min( columns, len( column_names ) ) ):
if column_headers[i] is None and column_names[i] is not None:
column_headers[i] = column_names[i]
@@ -213,7 +214,7 @@
i = int( getattr( dataset.metadata, name ) ) - 1
except:
i = -1
- if 0 <= i < dataset.metadata.columns and column_headers[i] is None:
+ if 0 <= i < columns and column_headers[i] is None:
column_headers[i] = column_parameter_alias.get(name, name)
out.append( '<tr>' )
@@ -236,13 +237,16 @@
try:
if not dataset.peek:
dataset.set_peek()
+ columns = dataset.metadata.columns
+ if columns is None:
+ columns = dataset.metadata.spec.columns.no_value
for line in dataset.peek.splitlines():
if line.startswith( tuple( skipchars ) ):
out.append( '<tr><td colspan="100%%">%s</td></tr>' % escape( line ) )
elif line:
elems = line.split( '\t' )
# we may have an invalid comment line or invalid data
- if len( elems ) != dataset.metadata.columns:
+ if len( elems ) != columns:
out.append( '<tr><td colspan="100%%">%s</td></tr>' % escape( line ) )
else:
out.append( '<tr>' )
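Both tabular.py hunks above guard against dataset.metadata.columns being unset (None), falling back to the no_value declared on the columns MetadataElement so that [None] * columns and the later length comparison cannot raise a TypeError. The guard in isolation (dataset stands in for an HDA-like object):

    columns = dataset.metadata.columns
    if columns is None:
        # Fall back to the default declared on the "columns" MetadataElement.
        columns = dataset.metadata.spec.columns.no_value
    column_headers = [None] * columns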
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/eggs/__init__.py
--- a/lib/galaxy/eggs/__init__.py
+++ b/lib/galaxy/eggs/__init__.py
@@ -378,7 +378,7 @@
return True
else:
try:
- return { "psycopg2": lambda: self.config.get( "app:main", "database_connection" ).startswith( "postgres://" ),
+ return { "psycopg2": lambda: self.config.get( "app:main", "database_connection" ).startswith( "postgres" ),
"MySQL_python": lambda: self.config.get( "app:main", "database_connection" ).startswith( "mysql://" ),
"DRMAA_python": lambda: "sge" in self.config.get( "app:main", "start_job_runners" ).split(","),
"drmaa": lambda: "drmaa" in self.config.get( "app:main", "start_job_runners" ).split(","),
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py
+++ b/lib/galaxy/jobs/__init__.py
@@ -2,6 +2,7 @@
Support for running a tool in Galaxy via an internal job management system
"""
+import time
import copy
import datetime
import galaxy
@@ -17,7 +18,7 @@
import traceback
from galaxy import model, util
from galaxy.datatypes import metadata
-from galaxy.exceptions import ObjectInvalid
+from galaxy.exceptions import ObjectInvalid, ObjectNotFound
from galaxy.jobs.actions.post import ActionBox
from galaxy.jobs.mapper import JobRunnerMapper
from galaxy.jobs.runners import BaseJobRunner
@@ -179,7 +180,7 @@
if tools is not None:
for tool in self.__findall_with_required(tools, 'tool'):
# There can be multiple definitions with identical ids, but different params
- id = tool.get('id')
+ id = tool.get('id').lower()
if id not in self.tools:
self.tools[id] = list()
self.tools[id].append(JobToolConfiguration(**dict(tool.items())))
@@ -926,6 +927,17 @@
context = self.get_dataset_finish_context( job_context, dataset_assoc.dataset.dataset )
#should this also be checking library associations? - can a library item be added from a history before the job has ended? - let's not allow this to occur
for dataset in dataset_assoc.dataset.dataset.history_associations + dataset_assoc.dataset.dataset.library_associations: #need to update all associated output hdas, i.e. history was shared with job running
+ trynum = 0
+ while trynum < self.app.config.retry_job_output_collection:
+ try:
+ # Attempt to short circuit NFS attribute caching
+ os.stat( dataset.dataset.file_name )
+ os.chown( dataset.dataset.file_name, os.getuid(), -1 )
+ trynum = self.app.config.retry_job_output_collection
+ except ( OSError, ObjectNotFound ), e:
+ trynum += 1
+ log.warning( 'Error accessing %s, will retry: %s', dataset.dataset.file_name, e )
+ time.sleep( 2 )
dataset.blurb = 'done'
dataset.peek = 'no peek'
dataset.info = (dataset.info or '')
@@ -1483,6 +1495,12 @@
just copy these files directly to the ultimate destination.
"""
return output_path
+
+ @property
+ def requires_setting_metadata( self ):
+ if self.tool:
+ return self.tool.requires_setting_metadata
+ return False
class TaskWrapper(JobWrapper):
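The retry loop added to output collection above works around NFS attribute caching: stat()ing the file and issuing a no-op chown() nudges the NFS client to refresh its cached attributes before Galaxy examines the output, sleeping and retrying while the file is not yet visible. The same pattern as a standalone sketch (Galaxy reads the retry count from the retry_job_output_collection setting; it is a plain argument here):

    import os
    import time

    def wait_for_nfs_file(path, retries, delay=2):
        trynum = 0
        while trynum < retries:
            try:
                # Attempt to short circuit NFS attribute caching.
                os.stat(path)
                os.chown(path, os.getuid(), -1)   # same uid, gid -1: a no-op
                return True
            except OSError:
                trynum += 1
                time.sleep(delay)
        return False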
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/jobs/handler.py
--- a/lib/galaxy/jobs/handler.py
+++ b/lib/galaxy/jobs/handler.py
@@ -84,12 +84,25 @@
Checks all jobs that are in the 'new', 'queued' or 'running' state in
the database and requeues or cleans up as necessary. Only run as the
job handler starts.
+ If user activation is enforced, jobs belonging to inactive users are filtered out.
"""
- for job in self.sa_session.query( model.Job ).enable_eagerloads( False ) \
+ jobs_at_startup = []
+ if self.app.config.user_activation_on:
+ jobs_at_startup = self.sa_session.query( model.Job ).enable_eagerloads( False ) \
+ .outerjoin( model.User ) \
.filter( ( ( model.Job.state == model.Job.states.NEW ) \
| ( model.Job.state == model.Job.states.RUNNING ) \
| ( model.Job.state == model.Job.states.QUEUED ) ) \
- & ( model.Job.handler == self.app.config.server_name ) ):
+ & ( model.Job.handler == self.app.config.server_name ) \
+ & or_( ( model.Job.user_id == None ),( model.User.active == True ) ) ).all()
+ else:
+ jobs_at_startup = self.sa_session.query( model.Job ).enable_eagerloads( False ) \
+ .filter( ( ( model.Job.state == model.Job.states.NEW ) \
+ | ( model.Job.state == model.Job.states.RUNNING ) \
+ | ( model.Job.state == model.Job.states.QUEUED ) ) \
+ & ( model.Job.handler == self.app.config.server_name ) ).all()
+
+ for job in jobs_at_startup:
if job.tool_id not in self.app.toolbox.tools_by_id:
log.warning( "(%s) Tool '%s' removed from tool config, unable to recover job" % ( job.id, job.tool_id ) )
JobWrapper( job, self ).fail( 'This tool was disabled before the job completed. Please contact your Galaxy administrator.' )
@@ -146,8 +159,9 @@
over all new and waiting jobs to check the state of the jobs each
depends on. If the job has dependencies that have not finished,
it goes to the waiting queue. If the job has dependencies with errors,
- it is marked as having errors and removed from the queue. Otherwise,
- the job is dispatched.
+ it is marked as having errors and removed from the queue. If the job
+ belongs to an inactive user, it is ignored.
+ Otherwise, the job is dispatched.
"""
# Pull all new jobs from the queue at once
jobs_to_check = []
@@ -173,7 +187,17 @@
(model.LibraryDatasetDatasetAssociation.deleted == True),
(model.Dataset.state != model.Dataset.states.OK),
(model.Dataset.deleted == True)))).subquery()
- jobs_to_check = self.sa_session.query(model.Job).enable_eagerloads(False) \
+ if self.app.config.user_activation_on:
+ jobs_to_check = self.sa_session.query(model.Job).enable_eagerloads(False) \
+ .outerjoin( model.User ) \
+ .filter(and_((model.Job.state == model.Job.states.NEW),
+ or_((model.Job.user_id == None),(model.User.active == True)),
+ (model.Job.handler == self.app.config.server_name),
+ ~model.Job.table.c.id.in_(hda_not_ready),
+ ~model.Job.table.c.id.in_(ldda_not_ready))) \
+ .order_by(model.Job.id).all()
+ else:
+ jobs_to_check = self.sa_session.query(model.Job).enable_eagerloads(False) \
.filter(and_((model.Job.state == model.Job.states.NEW),
(model.Job.handler == self.app.config.server_name),
~model.Job.table.c.id.in_(hda_not_ready),
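With user activation enforced, both job queries above outer-join User so anonymous jobs (user_id is NULL) are kept while jobs owned by not-yet-activated accounts are skipped. A condensed sketch of the branching, omitting the state and dataset-readiness filters (model, app, and sa_session stand in for the handler's attributes):

    from sqlalchemy import or_

    query = sa_session.query(model.Job).enable_eagerloads(False)
    if app.config.user_activation_on:
        # The outer join keeps jobs with no user; the or_ admits anonymous
        # jobs plus jobs whose owner has activated their account.
        query = query.outerjoin(model.User) \
                     .filter(or_(model.Job.user_id == None,
                                 model.User.active == True))
    jobs = query.filter(model.Job.handler == app.config.server_name).all()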
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/jobs/runners/__init__.py
--- a/lib/galaxy/jobs/runners/__init__.py
+++ b/lib/galaxy/jobs/runners/__init__.py
@@ -167,16 +167,28 @@
if job_wrapper.dependency_shell_commands:
commands = "; ".join( job_wrapper.dependency_shell_commands + [ commands ] )
+ # Copying work dir outputs or setting metadata will mask return code of
+ # tool command. If these are used capture the return code and ensure
+ # the last thing that happens is an exit with return code.
+ capture_return_code_command = "; return_code=$?"
+ captured_return_code = False
+
# Append commands to copy job outputs based on from_work_dir attribute.
if include_work_dir_outputs:
work_dir_outputs = self.get_work_dir_outputs( job_wrapper )
if work_dir_outputs:
+ if not captured_return_code:
+ commands += capture_return_code_command
+ captured_return_code = True
commands += "; " + "; ".join( [ "if [ -f %s ] ; then cp %s %s ; fi" %
( source_file, source_file, destination ) for ( source_file, destination ) in work_dir_outputs ] )
# Append metadata setting commands, we don't want to overwrite metadata
# that was copied over in init_meta(), as per established behavior
- if include_metadata:
+ if include_metadata and job_wrapper.requires_setting_metadata:
+ if not captured_return_code:
+ commands += capture_return_code_command
+ captured_return_code = True
commands += "; cd %s; " % os.path.abspath( os.getcwd() )
commands += job_wrapper.setup_external_metadata(
exec_dir = os.path.abspath( os.getcwd() ),
@@ -185,6 +197,11 @@
output_fnames = job_wrapper.get_output_fnames(),
set_extension = False,
kwds = { 'overwrite' : False } )
+
+
+ if captured_return_code:
+ commands += '; sh -c "exit $return_code"'
+
return commands
def get_work_dir_outputs( self, job_wrapper ):
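The runner change above captures the tool's exit status right after the tool command, before the work-dir copy and metadata commands are appended, and ends the line with sh -c "exit $return_code" so post-processing cannot mask a tool failure. A sketch of how the command string grows (the command fragments and file names are hypothetical):

    commands = "tool_command"
    capture_return_code_command = "; return_code=$?"
    captured_return_code = False
    work_dir_outputs = [("working/out.dat", "outputs/out.dat")]  # hypothetical

    if work_dir_outputs:
        if not captured_return_code:
            # Capture $? immediately after the tool command runs.
            commands += capture_return_code_command
            captured_return_code = True
        commands += "; " + "; ".join("if [ -f %s ] ; then cp %s %s ; fi" % (src, src, dst)
                                     for (src, dst) in work_dir_outputs)

    if captured_return_code:
        # The final command re-raises the tool's own exit status.
        commands += '; sh -c "exit $return_code"'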
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/jobs/runners/drmaa.py
--- a/lib/galaxy/jobs/runners/drmaa.py
+++ b/lib/galaxy/jobs/runners/drmaa.py
@@ -166,7 +166,17 @@
# runJob will raise if there's a submit problem
if self.external_runJob_script is None:
- external_job_id = self.ds.runJob(jt)
+ # TODO: create a queue for retrying submission indefinitely
+ # TODO: configurable max tries and sleep
+ trynum = 0
+ external_job_id = None
+ while external_job_id is None and trynum < 5:
+ try:
+ external_job_id = self.ds.runJob(jt)
+ except drmaa.InternalException, e:
+ trynum += 1
+ log.warning( '(%s) drmaa.Session.runJob() failed, will retry: %s', galaxy_id_tag, e )
+ time.sleep( 5 )
else:
job_wrapper.change_ownership_for_run()
log.debug( '(%s) submitting with credentials: %s [uid: %s]' % ( galaxy_id_tag, job_wrapper.user_system_pwent[0], job_wrapper.user_system_pwent[2] ) )
@@ -226,7 +236,13 @@
if state == drmaa.JobState.RUNNING and not ajs.running:
ajs.running = True
ajs.job_wrapper.change_state( model.Job.states.RUNNING )
- if state in ( drmaa.JobState.DONE, drmaa.JobState.FAILED ):
+ if state == drmaa.JobState.FAILED:
+ if ajs.job_wrapper.get_state() != model.Job.states.DELETED:
+ ajs.stop_job = False
+ ajs.fail_message = "The cluster DRM system terminated this job"
+ self.work_queue.put( ( self.fail_job, ajs ) )
+ continue
+ if state == drmaa.JobState.DONE:
if ajs.job_wrapper.get_state() != model.Job.states.DELETED:
self.work_queue.put( ( self.finish_job, ajs ) )
continue
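Submission above now retries transient DRM faults, making up to five attempts with a five-second pause on drmaa.InternalException, and the DONE/FAILED terminal states are split so DRM-terminated jobs fail with an explicit message instead of going through normal finishing. The retry half in isolation (a sketch against the drmaa egg's session API):

    import time
    import drmaa

    def run_job_with_retries(session, job_template, max_tries=5, delay=5):
        external_job_id = None
        trynum = 0
        while external_job_id is None and trynum < max_tries:
            try:
                external_job_id = session.runJob(job_template)
            except drmaa.InternalException:
                # Transient DRM hiccup; back off and resubmit.
                trynum += 1
                time.sleep(delay)
        return external_job_id   # None if every attempt failed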
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -79,6 +79,8 @@
self.external = False
self.deleted = False
self.purged = False
+ self.active = False
+ self.activation_token = None
self.username = None
# Relationships
self.histories = []
@@ -1681,7 +1683,7 @@
rval += child.get_disk_usage( user )
return rval
- def to_dict( self, view='collection' ):
+ def to_dict( self, view='collection', expose_dataset_path=False ):
"""
Return attributes of this HDA that are exposed using the API.
"""
@@ -1714,6 +1716,9 @@
for name, spec in hda.metadata.spec.items():
val = hda.metadata.get( name )
if isinstance( val, MetadataFile ):
+ # only when explicitly set: fetching filepaths can be expensive
+ if not expose_dataset_path:
+ continue
val = val.file_name
# If no value for metadata, look in datatype for metadata.
elif val == None and hasattr( hda.datatype, name ):
@@ -2324,9 +2329,12 @@
self.id = None
self.user = None
+
class StoredWorkflow( object, Dictifiable):
+
dict_collection_visible_keys = ( 'id', 'name', 'published' )
dict_element_visible_keys = ( 'id', 'name', 'published' )
+
def __init__( self ):
self.id = None
self.user = None
@@ -2343,7 +2351,7 @@
self.tags.append(new_swta)
def to_dict( self, view='collection', value_mapper = None ):
- rval = super( StoredWorkflow, self ).to_dict(self, view=view, value_mapper = value_mapper)
+ rval = super( StoredWorkflow, self ).to_dict( view=view, value_mapper = value_mapper )
tags_str_list = []
for tag in self.tags:
tag_str = tag.user_tname
@@ -2354,7 +2362,11 @@
return rval
-class Workflow( object ):
+class Workflow( object, Dictifiable ):
+
+ dict_collection_visible_keys = ( 'name', 'has_cycles', 'has_errors' )
+ dict_element_visible_keys = ( 'name', 'has_cycles', 'has_errors' )
+
def __init__( self ):
self.user = None
self.name = None
@@ -2362,7 +2374,9 @@
self.has_errors = None
self.steps = []
+
class WorkflowStep( object ):
+
def __init__( self ):
self.id = None
self.type = None
@@ -2373,36 +2387,48 @@
self.input_connections = []
self.config = None
+
class WorkflowStepConnection( object ):
+
def __init__( self ):
self.output_step_id = None
self.output_name = None
self.input_step_id = None
self.input_name = None
+
class WorkflowOutput(object):
+
def __init__( self, workflow_step, output_name):
self.workflow_step = workflow_step
self.output_name = output_name
+
class StoredWorkflowUserShareAssociation( object ):
+
def __init__( self ):
self.stored_workflow = None
self.user = None
+
class StoredWorkflowMenuEntry( object ):
+
def __init__( self ):
self.stored_workflow = None
self.user = None
self.order_index = None
+
class WorkflowInvocation( object ):
pass
+
class WorkflowInvocationStep( object ):
pass
+
class MetadataFile( object ):
+
def __init__( self, dataset = None, name = None ):
if isinstance( dataset, HistoryDatasetAssociation ):
self.history_dataset = dataset
@@ -3886,7 +3912,7 @@
return [ tool_version.tool_id for tool_version in self.get_versions( app ) ]
def to_dict( self, view='element' ):
- rval = super( ToolVersion, self ).to_dict( self, view )
+ rval = super( ToolVersion, self ).to_dict( view=view )
rval['tool_name'] = self.tool_id
for a in self.parent_tool_association:
rval['parent_tool_id'] = a.parent_id
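Both to_dict() fixes in this file correct the same mistake: passing self explicitly to a method already bound via super(), which shifts every argument by one so the instance arrives as view. A minimal illustration:

    class Base(object):
        def to_dict(self, view='collection'):
            return {'view': view}

    class Derived(Base):
        def to_dict(self, view='element'):
            # Buggy: super(Derived, self).to_dict(self, view) -- the bound
            # call already supplies self, so the instance lands in "view".
            return super(Derived, self).to_dict(view=view)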
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/model/custom_types.py
--- a/lib/galaxy/model/custom_types.py
+++ b/lib/galaxy/model/custom_types.py
@@ -29,6 +29,8 @@
try:
if value[0] == 'x':
return binascii.unhexlify(value[1:])
+ elif value.startswith( '\\x' ):
+ return binascii.unhexlify( value[2:] )
else:
return value
except Exception, ex:
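The new elif branch accepts bytea values hex-escaped with a literal '\x' prefix, the form PostgreSQL 9.x emits by default (bytea_output = 'hex'), alongside the older 'x' prefix already handled; note the matching postgresql-9.2.4 / psycopg2 bumps in eggs.ini earlier in this merge. The decoder in isolation:

    import binascii

    def unescape_bytea(value):
        if value[0] == 'x':
            return binascii.unhexlify(value[1:])
        elif value.startswith('\\x'):
            # PostgreSQL >= 9.0 hex output: literal backslash-x prefix.
            return binascii.unhexlify(value[2:])
        return value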
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py
+++ b/lib/galaxy/model/mapping.py
@@ -52,7 +52,9 @@
Column( "form_values_id", Integer, ForeignKey( "form_values.id" ), index=True ),
Column( "deleted", Boolean, index=True, default=False ),
Column( "purged", Boolean, index=True, default=False ),
- Column( "disk_usage", Numeric( 15, 0 ), index=True ) )
+ Column( "disk_usage", Numeric( 15, 0 ), index=True ) ,
+ Column( "active", Boolean, index=True, default=True, nullable=False ),
+ Column( "activation_token", TrimmedString( 64 ), nullable=True, index=True ) )
model.UserAddress.table = Table( "user_address", metadata,
Column( "id", Integer, primary_key=True),
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/model/migrate/versions/0117_add_user_activation.py
--- /dev/null
+++ b/lib/galaxy/model/migrate/versions/0117_add_user_activation.py
@@ -0,0 +1,57 @@
+'''
+Created on Sep 10, 2013
+
+@author: marten
+
+Adds 'active' and 'activation_token' columns to the galaxy_user table.
+'''
+
+from sqlalchemy import *
+from sqlalchemy.orm import *
+from migrate import *
+from migrate.changeset import *
+from galaxy.model.custom_types import TrimmedString
+
+import logging
+log = logging.getLogger( __name__ )
+
+user_active_column = Column( "active", Boolean, default=True, nullable=True )
+user_activation_token_column = Column( "activation_token", TrimmedString( 64 ), nullable=True )
+
+
+def display_migration_details():
+ print ""
+ print "This migration script adds active and activation_token columns to the user table"
+
+def upgrade(migrate_engine):
+ print __doc__
+ metadata = MetaData()
+ metadata.bind = migrate_engine
+ metadata.reflect()
+
+ # Add the active and activation_token columns to the user table in one try block because they depend on each other.
+ try:
+ user_table = Table( "galaxy_user", metadata, autoload=True )
+ user_active_column.create( table = user_table , populate_default = True)
+ user_activation_token_column.create( table = user_table )
+ assert user_active_column is user_table.c.active
+ assert user_activation_token_column is user_table.c.activation_token
+ except Exception, e:
+ print str(e)
+ log.error( "Adding columns 'active' and 'activation_token' to galaxy_user table failed: %s" % str( e ) )
+ return
+
+def downgrade(migrate_engine):
+ metadata = MetaData()
+ metadata.bind = migrate_engine
+ metadata.reflect()
+
+ # Drop the user table's active and activation_token columns in one try block because they depend on each other.
+ try:
+ user_table = Table( "galaxy_user", metadata, autoload=True )
+ user_active = user_table.c.active
+ user_activation_token = user_table.c.activation_token
+ user_active.drop()
+ user_activation_token.drop()
+ except Exception, e:
+ log.debug( "Dropping 'active' and 'activation_token' columns from galaxy_user table failed: %s" % ( str( e ) ) )
\ No newline at end of file
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/objectstore/__init__.py
--- a/lib/galaxy/objectstore/__init__.py
+++ b/lib/galaxy/objectstore/__init__.py
@@ -5,7 +5,6 @@
"""
import os
-import time
import random
import shutil
import logging
@@ -15,6 +14,7 @@
from galaxy.jobs import Sleeper
from galaxy.model import directory_hash_id
from galaxy.exceptions import ObjectNotFound, ObjectInvalid
+from galaxy.util.odict import odict
from sqlalchemy.orm import object_session
@@ -25,7 +25,7 @@
"""
ObjectStore abstract interface
"""
- def __init__(self, config, **kwargs):
+ def __init__(self, config, config_xml=None, **kwargs):
self.running = True
self.extra_dirs = {}
@@ -188,19 +188,26 @@
>>> import tempfile
>>> file_path=tempfile.mkdtemp()
>>> obj = Bunch(id=1)
- >>> s = DiskObjectStore(Bunch(umask=077, job_working_directory=file_path, new_file_path=file_path), file_path=file_path)
+ >>> s = DiskObjectStore(Bunch(umask=077, job_working_directory=file_path, new_file_path=file_path, object_store_check_old_style=False), file_path=file_path)
>>> s.create(obj)
>>> s.exists(obj)
True
>>> assert s.get_filename(obj) == file_path + '/000/dataset_1.dat'
"""
- def __init__(self, config, file_path=None, extra_dirs=None):
- super(DiskObjectStore, self).__init__(config, file_path=file_path, extra_dirs=extra_dirs)
+ def __init__(self, config, config_xml=None, file_path=None, extra_dirs=None):
+ super(DiskObjectStore, self).__init__(config, config_xml=config_xml, file_path=file_path, extra_dirs=extra_dirs)
self.file_path = file_path or config.file_path
self.config = config
self.check_old_style = config.object_store_check_old_style
self.extra_dirs['job_work'] = config.job_working_directory
self.extra_dirs['temp'] = config.new_file_path
+ #The new config_xml overrides universe settings.
+ if config_xml:
+ for e in config_xml:
+ if e.tag == 'files_dir':
+ self.file_path = e.get('path')
+ else:
+ self.extra_dirs[e.tag] = e.get('path')
if extra_dirs is not None:
self.extra_dirs.update( extra_dirs )
@@ -364,29 +371,89 @@
super(CachingObjectStore, self).__init__(self, path, backend)
-class DistributedObjectStore(ObjectStore):
+class NestedObjectStore(ObjectStore):
+ """
+ Base for ObjectStores that use other ObjectStores
+ (DistributedObjectStore, HierarchicalObjectStore)
+ """
+
+ def __init__(self, config, config_xml=None):
+ super(NestedObjectStore, self).__init__(config, config_xml=config_xml)
+ self.backends = {}
+
+ def shutdown(self):
+ for store in self.backends.values():
+ store.shutdown()
+ super(NestedObjectStore, self).shutdown()
+
+ def exists(self, obj, **kwargs):
+ return self.__call_method('exists', obj, False, False, **kwargs)
+
+ def file_ready(self, obj, **kwargs):
+ return self.__call_method('file_ready', obj, False, False, **kwargs)
+
+ def create(self, obj, **kwargs):
+ random.choice(self.backends.values()).create(obj, **kwargs)
+
+ def empty(self, obj, **kwargs):
+ return self.__call_method('empty', obj, True, False, **kwargs)
+
+ def size(self, obj, **kwargs):
+ return self.__call_method('size', obj, 0, False, **kwargs)
+
+ def delete(self, obj, **kwargs):
+ return self.__call_method('delete', obj, False, False, **kwargs)
+
+ def get_data(self, obj, **kwargs):
+ return self.__call_method('get_data', obj, ObjectNotFound, True, **kwargs)
+
+ def get_filename(self, obj, **kwargs):
+ return self.__call_method('get_filename', obj, ObjectNotFound, True, **kwargs)
+
+ def update_from_file(self, obj, **kwargs):
+ if kwargs.get('create', False):
+ self.create(obj, **kwargs)
+ kwargs['create'] = False
+ return self.__call_method('update_from_file', obj, ObjectNotFound, True, **kwargs)
+
+ def get_object_url(self, obj, **kwargs):
+ return self.__call_method('get_object_url', obj, None, False, **kwargs)
+
+ def __call_method(self, method, obj, default, default_is_exception, **kwargs):
+ """
+ Check each child object store and call the method on the first one that contains the dataset
+ """
+ for key, store in self.backends.items():
+ if store.exists(obj, **kwargs):
+ return store.__getattribute__(method)(obj, **kwargs)
+ if default_is_exception:
+ raise default( 'objectstore, __call_method failed: %s on %s, kwargs: %s'
+ %( method, str( obj ), str( kwargs ) ) )
+ else:
+ return default
+
+
+class DistributedObjectStore(NestedObjectStore):
"""
ObjectStore that defers to a list of backends, for getting objects the
first store where the object exists is used, objects are created in a
store selected randomly, but with weighting.
"""
- def __init__(self, config, fsmon=False):
- super(DistributedObjectStore, self).__init__(config)
- self.distributed_config = config.distributed_object_store_config_file
- assert self.distributed_config is not None, "distributed object store ('object_store = distributed') " \
- "requires a config file, please set one in " \
- "'distributed_object_store_config_file')"
+ def __init__(self, config, config_xml=None, fsmon=False):
+ super(DistributedObjectStore, self).__init__(config, config_xml=config_xml)
+ if not config_xml:
+ self.distributed_config = config.distributed_object_store_config_file
+ assert self.distributed_config is not None, "distributed object store ('object_store = distributed') " \
+ "requires a config file, please set one in " \
+ "'distributed_object_store_config_file')"
self.backends = {}
self.weighted_backend_ids = []
self.original_weighted_backend_ids = []
self.max_percent_full = {}
self.global_max_percent_full = 0.0
-
random.seed()
-
- self.__parse_distributed_config(config)
-
+ self.__parse_distributed_config(config, config_xml)
self.sleeper = None
if fsmon and ( self.global_max_percent_full or filter( lambda x: x != 0.0, self.max_percent_full.values() ) ):
self.sleeper = Sleeper()
@@ -395,10 +462,14 @@
self.filesystem_monitor_thread.start()
log.info("Filesystem space monitor started")
- def __parse_distributed_config(self, config):
- tree = util.parse_xml(self.distributed_config)
- root = tree.getroot()
- log.debug('Loading backends for distributed object store from %s' % self.distributed_config)
+ def __parse_distributed_config(self, config, config_xml = None):
+ if not config_xml:
+ tree = util.parse_xml(self.distributed_config)
+ root = tree.getroot()
+ log.debug('Loading backends for distributed object store from %s' % self.distributed_config)
+ else:
+ root = config_xml.find('backends')
+ log.debug('Loading backends for distributed object store from %s' % config_xml.get('id'))
self.global_max_percent_full = float(root.get('maxpctfull', 0))
for elem in [ e for e in root if e.tag == 'backend' ]:
id = elem.get('id')
@@ -427,6 +498,11 @@
self.weighted_backend_ids.append(id)
self.original_weighted_backend_ids = self.weighted_backend_ids
+ def shutdown(self):
+ super(DistributedObjectStore, self).shutdown()
+ if self.sleeper is not None:
+ self.sleeper.wake()
+
def __filesystem_monitor(self):
while self.running:
new_weighted_backend_ids = self.original_weighted_backend_ids
@@ -438,17 +514,6 @@
self.weighted_backend_ids = new_weighted_backend_ids
self.sleeper.sleep(120) # Test free space every 2 minutes
- def shutdown(self):
- super(DistributedObjectStore, self).shutdown()
- if self.sleeper is not None:
- self.sleeper.wake()
-
- def exists(self, obj, **kwargs):
- return self.__call_method('exists', obj, False, False, **kwargs)
-
- def file_ready(self, obj, **kwargs):
- return self.__call_method('file_ready', obj, False, False, **kwargs)
-
def create(self, obj, **kwargs):
"""
create() is the only method in which obj.object_store_id may be None
@@ -467,30 +532,6 @@
log.debug("Using preferred backend '%s' for creation of %s %s" % (obj.object_store_id, obj.__class__.__name__, obj.id))
self.backends[obj.object_store_id].create(obj, **kwargs)
- def empty(self, obj, **kwargs):
- return self.__call_method('empty', obj, True, False, **kwargs)
-
- def size(self, obj, **kwargs):
- return self.__call_method('size', obj, 0, False, **kwargs)
-
- def delete(self, obj, **kwargs):
- return self.__call_method('delete', obj, False, False, **kwargs)
-
- def get_data(self, obj, **kwargs):
- return self.__call_method('get_data', obj, ObjectNotFound, True, **kwargs)
-
- def get_filename(self, obj, **kwargs):
- return self.__call_method('get_filename', obj, ObjectNotFound, True, **kwargs)
-
- def update_from_file(self, obj, **kwargs):
- if kwargs.get('create', False):
- self.create(obj, **kwargs)
- kwargs['create'] = False
- return self.__call_method('update_from_file', obj, ObjectNotFound, True, **kwargs)
-
- def get_object_url(self, obj, **kwargs):
- return self.__call_method('get_object_url', obj, None, False, **kwargs)
-
def __call_method(self, method, obj, default, default_is_exception, **kwargs):
object_store_id = self.__get_store_id_for(obj, **kwargs)
if object_store_id is not None:
@@ -519,33 +560,65 @@
return None
-class HierarchicalObjectStore(ObjectStore):
+class HierarchicalObjectStore(NestedObjectStore):
"""
ObjectStore that defers to a list of backends, for getting objects the
first store where the object exists is used, objects are always created
in the first store.
"""
- def __init__(self, backends=[]):
- super(HierarchicalObjectStore, self).__init__()
+ def __init__(self, config, config_xml=None, fsmon=False):
+ super(HierarchicalObjectStore, self).__init__(config, config_xml=config_xml)
+ self.backends = odict()
+ for b in sorted(config_xml.find('backends'), key=lambda b: int(b.get('order'))):
+ self.backends[int(b.get('order'))] = build_object_store_from_config(config, fsmon=fsmon, config_xml=b)
+ def exists(self, obj, **kwargs):
+ """
+ Exists must check all child object stores
+ """
+ for store in self.backends.values():
+ if store.exists(obj, **kwargs):
+ return True
+ return False
-def build_object_store_from_config(config, fsmon=False):
- """ Depending on the configuration setting, invoke the appropriate object store
+ def create(self, obj, **kwargs):
+ """
+ Create will always be called by the primary object_store
+ """
+ self.backends[0].create(obj, **kwargs)
+
+
+def build_object_store_from_config(config, fsmon=False, config_xml=None):
"""
- store = config.object_store
+ Depending on the configuration setting, invoke the appropriate object store
+ """
+
+ if not config_xml and config.object_store_config_file:
+ # This is a top level invocation of build_object_store_from_config, and
+ # we have an object_store_conf.xml -- read the .xml and build
+ # accordingly
+ tree = util.parse_xml(config.object_store_config_file)
+ root = tree.getroot()
+ store = root.get('type')
+ config_xml = root
+ elif config_xml:
+ store = config_xml.get('type')
+ else:
+ store = config.object_store
+
if store == 'disk':
- return DiskObjectStore(config=config)
+ return DiskObjectStore(config=config, config_xml=config_xml)
elif store == 's3' or store == 'swift':
from galaxy.objectstore.s3 import S3ObjectStore
- return S3ObjectStore(config=config)
+ return S3ObjectStore(config=config, config_xml=config_xml)
elif store == 'distributed':
- return DistributedObjectStore(config=config, fsmon=fsmon)
+ return DistributedObjectStore(config=config, fsmon=fsmon, config_xml=config_xml)
elif store == 'hierarchical':
- return HierarchicalObjectStore()
+ return HierarchicalObjectStore(config=config, config_xml=config_xml)
elif store == 'irods':
from galaxy.objectstore.rods import IRODSObjectStore
- return IRODSObjectStore(config=config)
+ return IRODSObjectStore(config=config, config_xml=config_xml)
else:
log.error("Unrecognized object store definition: {0}".format(store))
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/objectstore/s3.py
--- a/lib/galaxy/objectstore/s3.py
+++ b/lib/galaxy/objectstore/s3.py
@@ -33,7 +33,7 @@
cache exists that is used as an intermediate location for files between
Galaxy and S3.
"""
- def __init__(self, config):
+ def __init__(self, config, config_xml=None):
super(S3ObjectStore, self).__init__()
self.config = config
self.staging_path = self.config.file_path
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/security/validate_user_input.py
--- a/lib/galaxy/security/validate_user_input.py
+++ b/lib/galaxy/security/validate_user_input.py
@@ -2,18 +2,27 @@
VALID_PUBLICNAME_RE = re.compile( "^[a-z0-9\-]+$" )
VALID_PUBLICNAME_SUB = re.compile( "[^a-z0-9\-]" )
+# Basic regular expression to check email validity.
+VALID_EMAIL_RE = re.compile( "[^@]+@[^@]+\.[^@]+" )
FILL_CHAR = '-'
def validate_email( trans, email, user=None, check_dup=True ):
+ """
+ Validates the email format, also checks whether the domain is blacklisted in the disposable domains configuration.
+ """
message = ''
if user and user.email == email:
return message
- if len( email ) == 0 or "@" not in email or "." not in email:
- message = "Enter a real email address"
+ if not( VALID_EMAIL_RE.match( email ) ):
+ message = "Please enter your real email address."
elif len( email ) > 255:
- message = "Email address exceeds maximum allowable length"
+ message = "Email address exceeds maximum allowable length."
elif check_dup and trans.sa_session.query( trans.app.model.User ).filter_by( email=email ).first():
- message = "User with that email already exists"
+ message = "User with that email already exists."
+ # If the blacklist is not empty filter out the disposable domains.
+ elif trans.app.config.blacklist_content is not None:
+ if email.split('@')[1] in trans.app.config.blacklist_content:
+ message = "Please enter your permanent email address."
return message
def validate_publicname( trans, publicname, user=None ):
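validate_email() above now applies three checks in order: a deliberately loose format regex (anything@anything.anything), a length cap, and, when a blacklist was loaded at startup (see the config.py hunk earlier in this merge), a membership test on the domain part. Stripped of the transaction plumbing, roughly:

    import re

    VALID_EMAIL_RE = re.compile(r"[^@]+@[^@]+\.[^@]+")

    def validate_email(email, blacklist=None):
        if not VALID_EMAIL_RE.match(email):
            return "Please enter your real email address."
        if len(email) > 255:
            return "Email address exceeds maximum allowable length."
        if blacklist is not None and email.split('@')[1] in blacklist:
            return "Please enter your permanent email address."
        return ""   # an empty message means the address passed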
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -11,6 +11,7 @@
import shutil
import sys
import tempfile
+import threading
import traceback
import types
import urllib
@@ -970,6 +971,7 @@
"""
tool_type = 'default'
+ requires_setting_metadata = True
default_tool_action = DefaultToolAction
dict_collection_visible_keys = ( 'id', 'name', 'version', 'description' )
@@ -1333,19 +1335,16 @@
help_header = ""
help_footer = ""
if self.help is not None:
- # Handle tool help image display for tools that are contained in repositories that are in the tool shed or installed into Galaxy.
- # When tool config files use the special string $PATH_TO_IMAGES, the following code will replace that string with the path on disk.
- if self.repository_id and self.help.text.find( '$PATH_TO_IMAGES' ) >= 0:
- if self.app.name == 'galaxy':
- repository = self.sa_session.query( self.app.model.ToolShedRepository ).get( self.app.security.decode_id( self.repository_id ) )
- if repository:
- path_to_images = '/tool_runner/static/images/%s' % self.repository_id
- self.help.text = self.help.text.replace( '$PATH_TO_IMAGES', path_to_images )
- elif self.app.name == 'tool_shed':
- repository = self.sa_session.query( self.app.model.Repository ).get( self.app.security.decode_id( self.repository_id ) )
- if repository:
- path_to_images = '/repository/static/images/%s' % self.repository_id
- self.help.text = self.help.text.replace( '$PATH_TO_IMAGES', path_to_images )
+ if self.repository_id and self.help.text.find( '.. image:: ' ) >= 0:
+ # Handle tool help image display for tools that are contained in repositories in the tool shed or installed into Galaxy.
+ lock = threading.Lock()
+ lock.acquire( True )
+ try:
+ self.help.text = shed_util_common.set_image_paths( self.app, self.repository_id, self.help.text )
+ except Exception, e:
+ log.exception( "Exception in parse_help, so images may not be properly displayed:\n%s" % str( e ) )
+ finally:
+ lock.release()
help_pages = self.help.findall( "page" )
help_header = self.help.text
try:
@@ -3136,6 +3135,8 @@
dataset.
"""
tool_type = 'set_metadata'
+ requires_setting_metadata = False
+
def exec_after_process( self, app, inp_data, out_data, param_dict, job = None ):
for name, dataset in inp_data.iteritems():
external_metadata = JobExternalOutputMetadataWrapper( job )
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -43,6 +43,9 @@
except KeyError:
return default
+ def get_tables( self ):
+ return self.data_tables
+
def load_from_config_file( self, config_filename, tool_data_path, from_shed_config=False ):
"""
This method is called under 3 conditions:
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/tools/parameters/output.py
--- a/lib/galaxy/tools/parameters/output.py
+++ b/lib/galaxy/tools/parameters/output.py
@@ -65,7 +65,7 @@
def is_case( self, output_dataset, other_values ):
ref = self.get_ref( output_dataset, other_values )
return bool( str( ref ) == self.value )
-
+
class DatatypeIsInstanceToolOutputActionConditionalWhen( ToolOutputActionConditionalWhen ):
tag = "when datatype_isinstance"
def __init__( self, parent, config_elem, value ):
@@ -215,11 +215,12 @@
self.offset = elem.get( 'offset', -1 )
self.offset = int( self.offset )
else:
+ self.options = []
self.missing_tool_data_table_name = self.name
def get_value( self, other_values ):
- try:
+ if self.options:
options = self.options
- except:
+ else:
options = []
for filter in self.filters:
options = filter.filter_options( options, other_values )
@@ -248,7 +249,7 @@
def __init__( self, parent, elem ):
super( FormatToolOutputAction, self ).__init__( parent, elem )
self.default = elem.get( 'default', None )
-
+
def apply_action( self, output_dataset, other_values ):
value = self.option.get_value( other_values )
if value is None and self.default is not None:
@@ -431,7 +432,7 @@
value = str( getattr( ref.metadata, self.name ) )
rval = []
for fields in options:
- if self.keep == ( self.compare( fields[self.column], value ) ):
+ if self.keep == ( self.compare( fields[self.column], value ) ):
rval.append( fields )
return rval
@@ -452,7 +453,7 @@
value = self.cast( value )
except:
value = False #unable to cast or access value; treat as false
- if self.keep == bool( value ):
+ if self.keep == bool( value ):
rval.append( fields )
return rval
@@ -480,7 +481,7 @@
option_types = {}
for option_type in [ NullToolOutputActionOption, FromFileToolOutputActionOption, FromParamToolOutputActionOption, FromDataTableOutputActionOption ]:
option_types[ option_type.tag ] = option_type
-
+
filter_types = {}
for filter_type in [ ParamValueToolOutputActionOptionFilter, InsertColumnToolOutputActionOptionFilter, MultipleSplitterFilter, ColumnStripFilter, MetadataValueFilter, BooleanFilter, StringFunctionFilter, ColumnReplaceFilter ]:
filter_types[ filter_type.tag ] = filter_type
@@ -524,5 +525,5 @@
def compare_re_search( value1, value2 ):
#checks pattern=value2 in value1
return bool( re.search( value2, value1 ) )
-
+
compare_types = { 'eq':compare_eq, 'neq':compare_neq, 'gt':compare_gt, 'gte':compare_gte, 'lt':compare_lt, 'lte':compare_lte, 'in':compare_in, 'startswith':compare_startswith, 'endswith':compare_endswith, "re_search":compare_re_search }
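For example, the 're_search' comparator registered above treats its second argument as a regular expression pattern to search for within the first:

import re

def compare_re_search( value1, value2 ):
    # checks pattern=value2 in value1
    return bool( re.search( value2, value1 ) )

print( compare_re_search( 'chr22', 'chr' ) )   # True
print( compare_re_search( '22', '^chr' ) )     # False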
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/visualization/data_providers/genome.py
--- a/lib/galaxy/visualization/data_providers/genome.py
+++ b/lib/galaxy/visualization/data_providers/genome.py
@@ -690,33 +690,38 @@
allele_counts = [ 0 for i in range ( alt.count( ',' ) + 1 ) ]
sample_gts = []
- # Process and pack samples' genotype and count alleles across samples.
- alleles_seen = {}
- has_alleles = False
+ if samples_data:
+ # Process and pack samples' genotype and count alleles across samples.
+ alleles_seen = {}
+ has_alleles = False
- for i, sample in enumerate( samples_data ):
- # Parse and count alleles.
- genotype = sample.split( ':' )[ 0 ]
- has_alleles = False
- alleles_seen.clear()
- for allele in genotype_re.split( genotype ):
- try:
- # This may throw a ValueError if allele is missing.
- allele = int( allele )
+ for i, sample in enumerate( samples_data ):
+ # Parse and count alleles.
+ genotype = sample.split( ':' )[ 0 ]
+ has_alleles = False
+ alleles_seen.clear()
+ for allele in genotype_re.split( genotype ):
+ try:
+ # This may throw a ValueError if allele is missing.
+ allele = int( allele )
- # Only count allele if it hasn't been seen yet.
- if allele != 0 and allele not in alleles_seen:
- allele_counts[ allele - 1 ] += 1
- alleles_seen[ allele ] = True
- has_alleles = True
- except ValueError:
- pass
+ # Only count allele if it hasn't been seen yet.
+ if allele != 0 and allele not in alleles_seen:
+ allele_counts[ allele - 1 ] += 1
+ alleles_seen[ allele ] = True
+ has_alleles = True
+ except ValueError:
+ pass
- # If no alleles, use empty string as proxy.
- if not has_alleles:
- genotype = ''
+ # If no alleles, use empty string as proxy.
+ if not has_alleles:
+ genotype = ''
- sample_gts.append( genotype )
+ sample_gts.append( genotype )
+ else:
+ # No samples, so set allele count and sample genotype manually.
+ allele_counts = [ 1 ]
+ sample_gts = [ '1/1' ]
# Add locus data.
locus_data = [
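A standalone sketch of the per-sample genotype handling above (genotype_re is assumed to split on the usual '/' and '|' separators; the function name and example input are illustrative):

import re

genotype_re = re.compile( r"[/|]" )  # assumed separator pattern

def count_alleles( samples_data, num_alts ):
    allele_counts = [ 0 ] * num_alts
    sample_gts = []
    if samples_data:
        for sample in samples_data:
            genotype = sample.split( ':' )[ 0 ]
            has_alleles = False
            alleles_seen = set()
            for allele in genotype_re.split( genotype ):
                try:
                    allele = int( allele )  # raises ValueError for missing alleles ('.')
                    # Only count an allele if it hasn't been seen yet in this sample.
                    if allele != 0 and allele not in alleles_seen:
                        allele_counts[ allele - 1 ] += 1
                        alleles_seen.add( allele )
                        has_alleles = True
                except ValueError:
                    pass
            # Homozygous-reference and missing genotypes are packed as ''.
            sample_gts.append( genotype if has_alleles else '' )
    else:
        # No samples, so set allele count and sample genotype manually.
        allele_counts = [ 1 ]
        sample_gts = [ '1/1' ]
    return allele_counts, sample_gts

print( count_alleles( [ '0/1:35', '1/1:20', './.' ], 1 ) )  # -> ([2], ['0/1', '1/1', ''])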
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -24,6 +24,7 @@
from galaxy import model
from galaxy import security
from galaxy import util
+from galaxy import objectstore
from galaxy.web import error, url_for
from galaxy.web.form_builder import AddressField, CheckboxField, SelectField, TextArea, TextField
@@ -583,7 +584,8 @@
"""
#precondition: the user's access to this hda has already been checked
#TODO:?? postcondition: all ids are encoded (is this really what we want at this level?)
- hda_dict = hda.to_dict( view='element' )
+ expose_dataset_path = trans.user_is_admin() or trans.app.config.expose_dataset_path
+ hda_dict = hda.to_dict( view='element', expose_dataset_path=expose_dataset_path )
hda_dict[ 'api_type' ] = "file"
# Add additional attributes that depend on trans and hence must be added here rather than at the model level.
@@ -599,8 +601,11 @@
#TODO: to_dict should really go AFTER this - only summary data
return trans.security.encode_dict_ids( hda_dict )
- if trans.user_is_admin() or trans.app.config.expose_dataset_path:
- hda_dict[ 'file_name' ] = hda.file_name
+ if expose_dataset_path:
+ try:
+ hda_dict[ 'file_name' ] = hda.file_name
+ except objectstore.ObjectNotFound, onf:
+ log.exception( 'objectstore.ObjectNotFound, HDA %s: %s', hda.id, onf )
hda_dict[ 'download_url' ] = url_for( 'history_contents_display',
history_id = trans.security.encode_id( hda.history.id ),
@@ -1077,7 +1082,6 @@
return {
"dataset_id": trans.security.decode_id( dataset_dict['id'] ),
"hda_ldda": dataset_dict.get('hda_ldda', 'hda'),
- "name": track_dict['name'],
"track_type": track_dict['track_type'],
"prefs": track_dict['prefs'],
"mode": track_dict['mode'],
@@ -1096,7 +1100,6 @@
drawable = unpack_collection( drawable_json )
unpacked_drawables.append( drawable )
return {
- "name": collection_json.get( 'name', '' ),
"obj_type": collection_json[ 'obj_type' ],
"drawables": unpacked_drawables,
"prefs": collection_json.get( 'prefs' , [] ),
@@ -1187,7 +1190,6 @@
return {
"track_type": dataset.datatype.track_type,
"dataset": trans.security.encode_dict_ids( dataset.to_dict() ),
- "name": track_dict['name'],
"prefs": prefs,
"mode": track_dict.get( 'mode', 'Auto' ),
"filters": track_dict.get( 'filters', { 'filters' : track_data_provider.get_filters() } ),
@@ -1203,7 +1205,6 @@
else:
drawables.append( pack_collection( drawable_dict ) )
return {
- 'name': collection_dict.get( 'name', 'dummy' ),
'obj_type': collection_dict[ 'obj_type' ],
'drawables': drawables,
'prefs': collection_dict.get( 'prefs', [] ),
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/web/form_builder.py
--- a/lib/galaxy/web/form_builder.py
+++ b/lib/galaxy/web/form_builder.py
@@ -756,13 +756,6 @@
refresh_on_change=refresh_on_change,
refresh_on_change_values=refresh_on_change_values,
size=size )
- if display is None and initial_value == 'none':
- # Only insert an initial "Select one" option if we are not displaying check boxes
- # or radio buttons and we have not received an initial_value other than 'none'.
- if selected_value == initial_value:
- select_field.add_option( 'Select one', initial_value, selected=True )
- else:
- select_field.add_option( 'Select one', initial_value )
for obj in objs:
if label_attr == 'self':
# Each obj is a string
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/web/framework/base.py
--- a/lib/galaxy/web/framework/base.py
+++ b/lib/galaxy/web/framework/base.py
@@ -379,7 +379,7 @@
"""
Send an HTTP redirect response to (target `url`)
"""
- raise httpexceptions.HTTPFound( url, headers=self.wsgi_headeritems() )
+ raise httpexceptions.HTTPFound( url.encode('utf-8'), headers=self.wsgi_headeritems() )
def wsgi_headeritems( self ):
"""
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -198,7 +198,7 @@
if page_num == 0:
# Show all rows in page.
total_num_rows = query.count()
- # persistant page='all'
+ # persistent page='all'
page_num = 1
#page_num = 'all'
#extra_url_args['page'] = page_num
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/web/framework/middleware/request_id.py
--- a/lib/galaxy/web/framework/middleware/request_id.py
+++ b/lib/galaxy/web/framework/middleware/request_id.py
@@ -8,5 +8,5 @@
def __init__( self, app, global_conf=None ):
self.app = app
def __call__( self, environ, start_response ):
- environ['request_id'] = uuid.uuid1().hex
- return self.app( environ, start_response )
\ No newline at end of file
+ environ['request_id'] = uuid.uuid1().hex
+ return self.app( environ, start_response )
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/datatypes.py
--- /dev/null
+++ b/lib/galaxy/webapps/galaxy/api/datatypes.py
@@ -0,0 +1,25 @@
+"""
+API operations allowing clients to determine the datatypes supported by Galaxy.
+"""
+
+from galaxy import web
+from galaxy.web.base.controller import BaseAPIController
+from galaxy.datatypes.registry import Registry
+
+import logging
+log = logging.getLogger( __name__ )
+
+class DatatypesController( BaseAPIController ):
+ @web.expose_api
+ def index( self, trans, **kwd ):
+ """
+ GET /api/datatypes
+ Return an object containing datatypes.
+ """
+ try:
+ return trans.app.datatypes_registry.upload_file_formats
+
+ except Exception, exception:
+ log.error( 'could not get datatypes: %s', str( exception ), exc_info=True )
+ trans.response.status = 500
+ return { 'error': str( exception ) }
\ No newline at end of file
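A hedged usage sketch for the new endpoint (host and API key are placeholders; the response is whatever upload_file_formats holds, typically a list of extension strings):

import json
import urllib2

url = 'http://localhost:8080/api/datatypes?key=YOUR_API_KEY'  # hypothetical host/key
datatypes = json.loads( urllib2.urlopen( url ).read() )
print( datatypes )  # e.g. [ 'ab1', 'bam', 'bed', ... ]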
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/folders.py
--- a/lib/galaxy/webapps/galaxy/api/folders.py
+++ b/lib/galaxy/webapps/galaxy/api/folders.py
@@ -52,7 +52,7 @@
o payload's relevant params:
- folder_id: This is the parent folder's id (required)
"""
- log.debug( "FoldersController.create: enter" )
+# log.debug( "FoldersController.create: enter" )
# TODO: Create a single point of exit if possible. For now we only
# exit at the end and on exceptions.
if 'folder_id' not in payload:
@@ -90,7 +90,7 @@
name = v.name,
url = url_for( 'folder', id=encoded_id ) ) )
else:
- log.debug( "Error creating folder; setting output and status" )
+# log.debug( "Error creating folder; setting output and status" )
trans.response.status = status
rval = output
return rval
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/history_contents.py
--- a/lib/galaxy/webapps/galaxy/api/history_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/history_contents.py
@@ -214,69 +214,6 @@
trans.sa_session.flush()
return hda.to_dict()
- # copy from upload
- if source == 'upload':
-
- # get upload specific features
- dbkey = payload.get('dbkey', None)
- extension = payload.get('extension', None)
- space_to_tabs = payload.get('space_to_tabs', False)
-
- # check for filename
- if content.filename is None:
- trans.response.status = 400
- return "history_contents:create() : The contents parameter needs to contain the uploaded file content."
-
- # create a dataset
- dataset = trans.app.model.Dataset()
- trans.sa_session.add(dataset)
- trans.sa_session.flush()
-
- # get file destination
- file_destination = dataset.get_file_name()
-
- # check if the directory exists
- dn = os.path.dirname(file_destination)
- if not os.path.exists(dn):
- os.makedirs(dn)
-
- # get file and directory names
- fn = os.path.basename(content.filename)
-
- # save file locally
- open(file_destination, 'wb').write(content.file.read())
-
- # log
- log.info ('The file "' + fn + '" was uploaded successfully.')
-
- # replace separation with tabs
- if space_to_tabs:
- log.info ('Replacing spaces with tabs.')
- sniff.convert_newlines_sep2tabs(file_destination)
-
- # guess extension
- if extension is None:
- log.info ('Guessing extension.')
- extension = sniff.guess_ext(file_destination)
-
- # create hda
- hda = trans.app.model.HistoryDatasetAssociation(dataset = dataset, name = content.filename,
- extension = extension, dbkey = dbkey, history = history, sa_session = trans.sa_session)
-
- # add status ok
- hda.state = hda.states.OK
-
- # add dataset to history
- history.add_dataset(hda, genome_build = dbkey)
- permissions = trans.app.security_agent.history_get_default_permissions( history )
- trans.app.security_agent.set_all_dataset_permissions( hda.dataset, permissions )
-
- # add to session
- trans.sa_session.add(hda)
- trans.sa_session.flush()
-
- # get name
- return hda.to_dict()
else:
# other options
trans.response.status = 501
@@ -328,6 +265,75 @@
return changed
+ @web.expose_api
+ def delete( self, trans, history_id, id, **kwd ):
+ """
+ delete( self, trans, history_id, id, **kwd )
+ * DELETE /api/histories/{history_id}/contents/{id}
+ delete the HDA with the given ``id``
+ .. note:: Currently does not stop any active jobs for which this dataset is an output.
+
+ :type id: str
+ :param id: the encoded id of the HDA to delete
+ :type kwd: dict
+ :param kwd: (optional) dictionary structure containing:
+
+ * payload: a dictionary itself containing:
+ * purge: if True, purge the HDA
+
+ :rtype: dict
+ :returns: an error object if an error occurred or a dictionary containing:
+ * id: the encoded id of the HDA,
+ * deleted: if the HDA was marked as deleted,
+ * purged: if the HDA was purged
+ """
+ # a request body is optional here
+ purge = False
+ if kwd.get( 'payload', None ):
+ purge = string_as_bool( kwd['payload'].get( 'purge', False ) )
+
+ rval = { 'id' : id }
+ try:
+ hda = self.get_dataset( trans, id,
+ check_ownership=True, check_accessible=True, check_state=True )
+ hda.deleted = True
+
+ if purge:
+ if not trans.app.config.allow_user_dataset_purge:
+ raise HTTPForbidden( detail='This instance does not allow user dataset purging' )
+
+ hda.purged = True
+ trans.sa_session.add( hda )
+ trans.sa_session.flush()
+
+ if hda.dataset.user_can_purge:
+ try:
+ hda.dataset.full_delete()
+ trans.sa_session.add( hda.dataset )
+ except:
+ pass
+ # flush now to preserve deleted state in case of later interruption
+ trans.sa_session.flush()
+
+ rval[ 'purged' ] = True
+
+ trans.sa_session.flush()
+ rval[ 'deleted' ] = True
+
+ except HTTPInternalServerError, http_server_err:
+ log.exception( 'HDA API, delete: uncaught HTTPInternalServerError: %s, %s\n%s',
+ id, str( kwd ), str( http_server_err ) )
+ raise
+ except HTTPException, http_exc:
+ raise
+ except Exception, exc:
+ log.exception( 'HDA API, delete: uncaught exception: %s, %s\n%s',
+ id, str( kwd ), str( exc ) )
+ trans.response.status = 500
+ rval.update({ 'error': str( exc ) })
+
+ return rval
+
def _validate_and_parse_update_payload( self, payload ):
"""
Validate and parse incoming data payload for an HDA.
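A usage sketch for the DELETE endpoint added above (host, ids and key are placeholders; this assumes the framework unpacks the JSON request body into the payload dict the handler reads):

import json
import urllib2

url = 'http://localhost:8080/api/histories/HISTORY_ID/contents/HDA_ID?key=YOUR_API_KEY'
request = urllib2.Request( url, data=json.dumps( { 'purge': True } ),
                           headers={ 'Content-Type': 'application/json' } )
request.get_method = lambda: 'DELETE'  # urllib2 defaults to GET/POST, so force DELETE
print( urllib2.urlopen( request ).read() )  # e.g. {"id": ..., "deleted": true, "purged": true}

Note that purging only succeeds when the instance enables allow_user_dataset_purge, per the HTTPForbidden branch above.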
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/libraries.py
--- a/lib/galaxy/webapps/galaxy/api/libraries.py
+++ b/lib/galaxy/webapps/galaxy/api/libraries.py
@@ -28,7 +28,7 @@
:returns: list of dictionaries containing library information
.. seealso:: :attr:`galaxy.model.Library.dict_collection_visible_keys`
"""
- log.debug( "LibrariesController.index: enter" )
+# log.debug( "LibrariesController.index: enter" )
query = trans.sa_session.query( trans.app.model.Library )
deleted = util.string_as_bool( deleted )
if deleted:
@@ -73,7 +73,7 @@
:returns: detailed library information
.. seealso:: :attr:`galaxy.model.Library.dict_element_visible_keys`
"""
- log.debug( "LibraryContentsController.show: enter" )
+# log.debug( "LibraryContentsController.show: enter" )
library_id = id
deleted = util.string_as_bool( deleted )
try:
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/library_contents.py
--- a/lib/galaxy/webapps/galaxy/api/library_contents.py
+++ b/lib/galaxy/webapps/galaxy/api/library_contents.py
@@ -53,8 +53,8 @@
can_access = trans.app.security_agent.can_access_dataset(
current_user_roles, ld.library_dataset_dataset_association.dataset )
if (admin or can_access) and not ld.deleted:
- log.debug( "type(folder): %s" % type( folder ) )
- log.debug( "type(api_path): %s; folder.api_path: %s" % ( type(folder.api_path), folder.api_path ) )
+ #log.debug( "type(folder): %s" % type( folder ) )
+ #log.debug( "type(api_path): %s; folder.api_path: %s" % ( type(folder.api_path), folder.api_path ) )
#log.debug( "attributes of folder: %s" % str(dir(folder)) )
ld.api_path = folder.api_path + '/' + ld.name
ld.api_type = 'file'
@@ -72,13 +72,13 @@
if not library or not ( trans.user_is_admin() or trans.app.security_agent.can_access_library( current_user_roles, library ) ):
trans.response.status = 400
return "Invalid library id ( %s ) specified." % str( library_id )
- log.debug( "Root folder type: %s" % type( library.root_folder ) )
+ #log.debug( "Root folder type: %s" % type( library.root_folder ) )
encoded_id = 'F' + trans.security.encode_id( library.root_folder.id )
rval.append( dict( id = encoded_id,
type = 'folder',
name = '/',
url = url_for( 'library_content', library_id=library_id, id=encoded_id ) ) )
- log.debug( "Root folder attributes: %s" % str(dir(library.root_folder)) )
+ #log.debug( "Root folder attributes: %s" % str(dir(library.root_folder)) )
library.root_folder.api_path = ''
for content in traverse( library.root_folder ):
encoded_id = trans.security.encode_id( content.id )
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -10,6 +10,7 @@
from tool_shed.galaxy_install import repository_util
from tool_shed.util import common_util
from tool_shed.util import encoding_util
+from tool_shed.util import workflow_util
import tool_shed.util.shed_util_common as suc
log = logging.getLogger( __name__ )
@@ -33,6 +34,102 @@
"""RESTful controller for interactions with tool shed repositories."""
@web.expose_api
+ def exported_workflows( self, trans, id, **kwd ):
+ """
+ GET /api/tool_shed_repositories/{encoded_tool_shed_repository_id}/exported_workflows
+
+ Display a list of dictionaries containing information about this tool shed repository's exported workflows.
+
+ :param id: the encoded id of the ToolShedRepository object
+ """
+ # Example URL: http://localhost:8763/api/tool_shed_repositories/f2db41e1fa331b3e/exported_…
+ # Since exported workflows are dictionaries with very few attributes that differentiate them from each other, we'll build the
+ # list based on the following dictionary of those few attributes.
+ exported_workflows = []
+ repository = suc.get_tool_shed_repository_by_id( trans, id )
+ metadata = repository.metadata
+ if metadata:
+ exported_workflow_tups = metadata.get( 'workflows', [] )
+ else:
+ exported_workflow_tups = []
+ for index, exported_workflow_tup in enumerate( exported_workflow_tups ):
+ # The exported_workflow_tup looks like ( relative_path, exported_workflow_dict ), where the value of relative_path is the location
+ # on disk (relative to the root of the installed repository) where the exported_workflow_dict file (.ga file) is located.
+ exported_workflow_dict = exported_workflow_tup[ 1 ]
+ annotation = exported_workflow_dict.get( 'annotation', '' )
+ format_version = exported_workflow_dict.get( 'format-version', '' )
+ workflow_name = exported_workflow_dict.get( 'name', '' )
+ # Since we don't have an in-memory object with an id, we'll identify the exported workflow via its location (i.e., index) in the list.
+ display_dict = dict( index=index, annotation=annotation, format_version=format_version, workflow_name=workflow_name )
+ exported_workflows.append( display_dict )
+ return exported_workflows
+
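A minimal call against the endpoint above (placeholders again for host, repository id and API key):

import json
import urllib2

url = 'http://localhost:8080/api/tool_shed_repositories/REPO_ID/exported_workflows?key=YOUR_API_KEY'
for workflow in json.loads( urllib2.urlopen( url ).read() ):
    print( '%s: %s' % ( workflow[ 'index' ], workflow[ 'workflow_name' ] ) )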
+ @web.expose_api
+ def import_workflow( self, trans, payload, **kwd ):
+ """
+ POST /api/tool_shed_repositories/import_workflow
+
+ Import the specified exported workflow contained in the specified installed tool shed repository into Galaxy.
+
+ :param key: the API key of the Galaxy user with which the imported workflow will be associated.
+ :param id: the encoded id of the ToolShedRepository object
+
+ The following parameters are included in the payload.
+ :param index: the index location of the workflow tuple in the list of exported workflows stored in the metadata for the specified repository
+ """
+ api_key = kwd.get( 'key', None )
+ if api_key is None:
+ raise HTTPBadRequest( detail="Missing required parameter 'key' whose value is the API key for the Galaxy user importing the specified workflow." )
+ tool_shed_repository_id = kwd.get( 'id', '' )
+ if not tool_shed_repository_id:
+ raise HTTPBadRequest( detail="Missing required parameter 'id'." )
+ index = payload.get( 'index', None )
+ if index is None:
+ raise HTTPBadRequest( detail="Missing required parameter 'index'." )
+ repository = suc.get_tool_shed_repository_by_id( trans, tool_shed_repository_id )
+ exported_workflows = json.from_json_string( self.exported_workflows( trans, tool_shed_repository_id ) )
+ # Since we don't have an in-memory object with an id, we'll identify the exported workflow via its location (i.e., index) in the list.
+ exported_workflow = exported_workflows[ int( index ) ]
+ workflow_name = exported_workflow[ 'workflow_name' ]
+ workflow, status, message = workflow_util.import_workflow( trans, repository, workflow_name )
+ if status == 'error':
+ log.error( message, exc_info=True )
+ trans.response.status = 500
+ return message
+ else:
+ return workflow.to_dict( view='element' )
+
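And a matching sketch for the import endpoint above, posting the workflow's index (placeholders as before):

import json
import urllib2

url = 'http://localhost:8080/api/tool_shed_repositories/import_workflow?key=YOUR_API_KEY&id=REPO_ID'
request = urllib2.Request( url, data=json.dumps( { 'index': 0 } ),
                           headers={ 'Content-Type': 'application/json' } )
print( urllib2.urlopen( request ).read() )  # the imported workflow, serialized via to_dict()

import_workflows below works the same way, just without the index payload, and returns one dictionary per imported workflow.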
+ @web.expose_api
+ def import_workflows( self, trans, **kwd ):
+ """
+ POST /api/tool_shed_repositories/import_workflows
+
+ Import all of the exported workflows contained in the specified installed tool shed repository into Galaxy.
+
+ :param key: the API key of the Galaxy user with which the imported workflows will be associated.
+ :param id: the encoded id of the ToolShedRepository object
+ """
+ api_key = kwd.get( 'key', None )
+ if api_key is None:
+ raise HTTPBadRequest( detail="Missing required parameter 'key' whose value is the API key for the Galaxy user importing the specified workflow." )
+ tool_shed_repository_id = kwd.get( 'id', '' )
+ if not tool_shed_repository_id:
+ raise HTTPBadRequest( detail="Missing required parameter 'id'." )
+ repository = suc.get_tool_shed_repository_by_id( trans, tool_shed_repository_id )
+ exported_workflows = json.from_json_string( self.exported_workflows( trans, tool_shed_repository_id ) )
+ imported_workflow_dicts = []
+ for exported_workflow_dict in exported_workflows:
+ workflow_name = exported_workflow_dict[ 'workflow_name' ]
+ workflow, status, message = workflow_util.import_workflow( trans, repository, workflow_name )
+ if status == 'error':
+ log.error( message, exc_info=True )
+ trans.response.status = 500
+ return message
+ else:
+ imported_workflow_dicts.append( workflow.to_dict( view='element' ) )
+ return imported_workflow_dicts
+
+ @web.expose_api
def index( self, trans, **kwd ):
"""
GET /api/tool_shed_repositories
@@ -58,28 +155,6 @@
return message
@web.expose_api
- def show( self, trans, id, **kwd ):
- """
- GET /api/tool_shed_repositories/{encoded_tool_shed_repository_id}
- Display a dictionary containing information about a specified tool_shed_repository.
-
- :param id: the encoded id of the ToolShedRepository object
- """
- # Example URL: http://localhost:8763/api/tool_shed_repositories/df7a1f0c02a5b08e
- try:
- tool_shed_repository = suc.get_tool_shed_repository_by_id( trans, id )
- tool_shed_repository_dict = tool_shed_repository.as_dict( value_mapper=default_tool_shed_repository_value_mapper( trans, tool_shed_repository ) )
- tool_shed_repository_dict[ 'url' ] = web.url_for( controller='tool_shed_repositories',
- action='show',
- id=trans.security.encode_id( tool_shed_repository.id ) )
- return tool_shed_repository_dict
- except Exception, e:
- message = "Error in tool_shed_repositories API in index: " + str( e )
- log.error( message, exc_info=True )
- trans.response.status = 500
- return message
-
- @web.expose_api
def install_repository_revision( self, trans, payload, **kwd ):
"""
POST /api/tool_shed_repositories/install_repository_revision
@@ -211,7 +286,6 @@
installation_dict = dict( install_repository_dependencies=install_repository_dependencies,
new_tool_panel_section=new_tool_panel_section,
no_changes_checked=False,
- reinstalling=False,
repo_info_dicts=repo_info_dicts,
tool_panel_section=tool_panel_section,
tool_path=tool_path,
@@ -412,3 +486,25 @@
tool_shed_repositories.append( repository_dict )
# Display the list of repaired repositories.
return tool_shed_repositories
+
+ @web.expose_api
+ def show( self, trans, id, **kwd ):
+ """
+ GET /api/tool_shed_repositories/{encoded_tool_shed_repository_id}
+ Display a dictionary containing information about a specified tool_shed_repository.
+
+ :param id: the encoded id of the ToolShedRepository object
+ """
+ # Example URL: http://localhost:8763/api/tool_shed_repositories/df7a1f0c02a5b08e
+ try:
+ tool_shed_repository = suc.get_tool_shed_repository_by_id( trans, id )
+ tool_shed_repository_dict = tool_shed_repository.as_dict( value_mapper=default_tool_shed_repository_value_mapper( trans, tool_shed_repository ) )
+ tool_shed_repository_dict[ 'url' ] = web.url_for( controller='tool_shed_repositories',
+ action='show',
+ id=trans.security.encode_id( tool_shed_repository.id ) )
+ return tool_shed_repository_dict
+ except Exception, e:
+ message = "Error in tool_shed_repositories API in index: " + str( e )
+ log.error( message, exc_info=True )
+ trans.response.status = 500
+ return message
diff -r 65158f203ac7b0bd540778b2771bde08557f832a -r acd70e8acbe6cea33d02366ad5dfa2a7e5124e1e lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -46,11 +46,17 @@
atexit.register( app.shutdown )
# Create the universe WSGI application
webapp = GalaxyWebApplication( app, session_cookie='galaxysession', name='galaxy' )
- # The following route will handle displaying tool help images for tools contained in repositories installed from the tool shed.
- webapp.add_route( '/tool_runner/static/images/:repository_id/:image_file', controller='tool_runner', action='display_tool_help_image_in_repository', repository_id=None, image_file=None )
+ # Handle displaying tool help images and README file images contained in repositories installed from the tool shed.
+ webapp.add_route( '/admin_toolshed/static/images/:repository_id/:image_file',
+ controller='admin_toolshed',
+ action='display_image_in_repository',
+ repository_id=None,
+ image_file=None )
webapp.add_ui_controllers( 'galaxy.webapps.galaxy.controllers', app )
# Force /history to go to /root/history -- needed since the tests assume this
webapp.add_route( '/history', controller='root', action='history' )
+ # Force /activate to go to the controller
+ webapp.add_route( '/activate', controller='user', action='activate' )
# These two routes handle our simple needs at the moment
webapp.add_route( '/async/:tool_id/:data_id/:data_secret', controller='async', action='index', tool_id=None, data_id=None, data_secret=None )
webapp.add_route( '/:controller/:action', action='index' )
@@ -156,6 +162,7 @@
webapp.mapper.resource( 'workflow', 'workflows', path_prefix='/api' )
webapp.mapper.resource_with_deleted( 'history', 'histories', path_prefix='/api' )
webapp.mapper.resource( 'configuration', 'configuration', path_prefix='/api' )
+ webapp.mapper.resource( 'datatype', 'datatypes', path_prefix='/api' )
#webapp.mapper.connect( 'run_workflow', '/api/workflow/{workflow_id}/library/{library_id}', controller='workflows', action='run', workflow_id=None, library_id=None, conditions=dict(method=["GET"]) )
webapp.mapper.resource( 'search', 'search', path_prefix='/api' )
@@ -176,7 +183,10 @@
# Galaxy API for tool shed features.
webapp.mapper.resource( 'tool_shed_repository',
'tool_shed_repositories',
- member={ 'repair_repository_revision' : 'POST' },
+ member={ 'repair_repository_revision' : 'POST',
+ 'exported_workflows' : 'GET',
+ 'import_workflow' : 'POST',
+ 'import_workflows' : 'POST' },
controller='tool_shed_repositories',
name_prefix='tool_shed_repository_',
path_prefix='/api',
This diff is so big that we needed to truncate the remainder.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: Dave Bouvier: Tool dependencies with errors are now individually uninstalled.
by commits-noreply@bitbucket.org 15 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/b1f3de87ea99/
Changeset: b1f3de87ea99
User: Dave Bouvier
Date: 2013-10-15 19:47:19
Summary: Tool dependencies with errors are now individually uninstalled.
Affected #: 1 file
diff -r 917be63dc031533f9fbd0db50ec3d8d55fd66e9f -r b1f3de87ea9920cc5daaa843b613a73e316fe043 test/install_and_test_tool_shed_repositories/functional_tests.py
--- a/test/install_and_test_tool_shed_repositories/functional_tests.py
+++ b/test/install_and_test_tool_shed_repositories/functional_tests.py
@@ -1066,10 +1066,7 @@
# repository using Twill. If tool dependencies failed installation, select to uninstall instead of deactivate,
# to make way for the next attempt. Otherwise, default to the value determined by the environment variable
# GALAXY_INSTALL_TEST_KEEP_TOOL_DEPENDENCIES.
- if failed_tool_dependencies:
- execute_uninstall_method( app, deactivate_only=False )
- else:
- execute_uninstall_method( app, deactivate_only=deactivate_only )
+ execute_uninstall_method( app, deactivate_only=deactivate_only )
# Set the test_toolbox.toolbox module-level variable to the new app.toolbox.
test_toolbox.toolbox = app.toolbox
repositories_failed_install.append( dict( name=name, owner=owner, changeset_revision=changeset_revision ) )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: greg: Code cleanup in the tool shed's install_util.
by commits-noreply@bitbucket.org 15 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/917be63dc031/
Changeset: 917be63dc031
User: greg
Date: 2013-10-15 18:09:33
Summary: Code cleanup in the tool shed's install_util.
Affected #: 1 file
diff -r d3f602da35338f2a5208cdcadd3d36bdc9eb00f2 -r 917be63dc031533f9fbd0db50ec3d8d55fd66e9f lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
--- a/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
+++ b/lib/tool_shed/galaxy_install/tool_dependencies/install_util.py
@@ -14,8 +14,8 @@
from tool_shed.util import xml_util
from tool_shed.galaxy_install.tool_dependencies import td_common_util
from galaxy.model.orm import and_
-from galaxy.web import url_for
from galaxy.util import asbool
+from galaxy.util import listify
log = logging.getLogger( __name__ )
@@ -673,20 +673,6 @@
else:
install_and_build_package_via_fabric( app, tool_dependency, actions_dict )
-def listify( item ):
- """
- Make a single item a single item list, or return a list if passed a
- list. Passing a None returns an empty list.
- """
- if not item:
- return []
- elif isinstance( item, list ):
- return item
- elif isinstance( item, basestring ) and item.count( ',' ):
- return item.split( ',' )
- else:
- return [ item ]
-
def parse_env_shell_entry( action, name, value, line ):
new_value = value
var_name = '$%s' % name
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: natefoo: Include job_runner_external_id in the dataset error email.
by commits-noreply@bitbucket.org 15 Oct '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/d3f602da3533/
Changeset: d3f602da3533
User: natefoo
Date: 2013-10-15 17:48:27
Summary: Include job_runner_external_id in the dataset error email.
Affected #: 1 file
diff -r 1f9ecea922615ec3c231aa196a8b1785e7b148a4 -r d3f602da35338f2a5208cdcadd3d36bdc9eb00f2 lib/galaxy/webapps/galaxy/controllers/dataset.py
--- a/lib/galaxy/webapps/galaxy/controllers/dataset.py
+++ b/lib/galaxy/webapps/galaxy/controllers/dataset.py
@@ -64,6 +64,7 @@
-----------------------------------------------------------------------------
job id: ${job_id}
tool id: ${job_tool_id}
+job pid or drm id: ${job_runner_external_id}
-----------------------------------------------------------------------------
job command line:
${job_command_line}
@@ -231,6 +232,7 @@
history_view_link=history_view_link,
job_id=job.id,
job_tool_id=job.tool_id,
+ job_runner_external_id=job.job_runner_external_id,
job_command_line=job.command_line,
job_stderr=util.unicodify( job.stderr ),
job_stdout=util.unicodify( job.stdout ),
Repository URL: https://bitbucket.org/galaxy/galaxy-central/