Further, it seems that the job doesn't manage to get hold of the file from the specified input directory, as I can see from the dataset record at:

http://barium-rbh/csiro/api/histories/964b37715ec9bd22/contents/2faba7054d92b2df

{
    "data_type": "html", 
    "deleted": false, 
    "download_url": "/csiro/datasets/2faba7054d92b2df/display?to_ext=html", 
    "file_name": "/home/galaxy/galaxy-dist/database/files/000/dataset_137.dat", 
    "file_size": 194, 
    "genome_build": "?", 
    "id": "2faba7054d92b2df", 
    "metadata_data_lines": null, 
    "metadata_dbkey": "?", 
    "misc_blurb": "error", 
    "misc_info": "Wed May  8 15:07:27 2013\nbashScript is: /home/galaxy/galaxy-dist/tools/visualization/extractSlice-wrapper.sh\ninput_image is: None\ncat: None: No such file or directory\nFailed reading file /tmp/tmp.9LWrz6SaLy.nii.gz\nitk::ERROR: PNGImageIO(0x1abdcf0): PNGIma", 
    "model_class": "HistoryDatasetAssociation", 
    "name": "Extract 2D slice on None", 
    "state": "error", 
    "visible": true
}

In particular, misc_info shows:

input_image is: None

Does anybody know why the file may not be getting read? It is being copied from the specified input directory to the output directory.
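
For reference, the dataset record above came from a plain GET against that URL; a rough sketch of the call, assuming the requests library and that the API key is passed as the "key" query parameter:

import requests

API_KEY = "64f3209856a3cf4f2d034a1ad5bf851c"
BASE = "http://barium-rbh/csiro/api"

# Fetch the errored dataset's details (state, misc_info, file_name, ...)
url = BASE + "/histories/964b37715ec9bd22/contents/2faba7054d92b2df"
dataset = requests.get(url, params={"key": API_KEY}).json()
print(dataset["state"])      # "error"
print(dataset["misc_info"])  # the wrapper script's stderr shown above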

I have set:

allow_library_path_paste = True

and added my user to admin_users.
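
For completeness, both settings live in universe_wsgi.ini; roughly as below (assuming the default config file, with a placeholder admin address):

# universe_wsgi.ini
allow_library_path_paste = True
# comma-separated list of admin e-mail addresses (placeholder shown)
admin_users = your-admin-address@example.org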

Thanks for any help

Neil



From: Burdett, Neil (ICT Centre, Herston - RBWH)
Sent: Wednesday, May 08, 2013 2:46 PM
To: galaxy-dev@lists.bx.psu.edu
Subject: Getting example_watch_folder.py to work...

Hi,
     I'm trying to get example_watch_folder.py to run, but it seems to fail and I'm not sure why.

I run:

 ./example_watch_folder.py 64f3209856a3cf4f2d034a1ad5bf851c http://barium-rbh/csiro/api/ /home/galaxy/galaxy-drop/input /home/galaxy/galaxy-drop/output "My API Import" f597429621d6eb2b
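
(If I'm reading the script's usage right, the arguments are, in order:

./example_watch_folder.py <api_key> <api_url> <in_folder> <out_folder> <data_library> <workflow_id>

so "My API Import" is the data library the files get uploaded to and f597429621d6eb2b is the workflow to run on each file.)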

I got the workflow ID from:

http://barium-rbh/csiro/api/workflows

which gave me:
[
    {
        "id": "f597429621d6eb2b", 
        "name": "extract", 
        "url": "/csiro/api/workflows/f597429621d6eb2b"
    }, 
    {
        "id": "f2db41e1fa331b3e", 
        "name": "FULL CTE", 
        "url": "/csiro/api/workflows/f2db41e1fa331b3e"
    }
]
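
That listing is just a GET on the workflows endpoint; a minimal sketch, again assuming requests and the "key" parameter:

import requests

API_KEY = "64f3209856a3cf4f2d034a1ad5bf851c"

# List the workflows visible to this API key; each entry carries the encoded id
workflows = requests.get("http://barium-rbh/csiro/api/workflows",
                         params={"key": API_KEY}).json()
for wf in workflows:
    print("%s  %s" % (wf["id"], wf["name"]))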

The output I get from the command line is:
{'outputs': ['ba0fa2aed4052bce'], 'history': 'ba03619785539f8c'}

The files I put into /home/galaxy/galaxy-drop/input get copied to /home/galaxy/galaxy-drop/output, but nothing else happens.

If I go to http://barium-rbh/csiro/api/histories

I can see:
{ "id": "ba03619785539f8c", "name": "colin.nii.gz - extract", "url": "/csiro/api/histories/ba03619785539f8c" },

However when I go to:
http://barium-rbh/csiro/api/histories/ba03619785539f8c

I get:
{ "contents_url": "/csiro/api/histories/ba03619785539f8c/contents", "id": "ba03619785539f8c", "name": "colin.nii.gz - extract", "state": "error", "state_details": { "discarded": 0, "empty": 0, "error": 1, "failed_metadata": 0, "new": 0, "ok": 0, "queued": 0, "running": 0, "setting_metadata": 0, "upload": 0 } }

Any ideas would be much appreciated.

Thanks
Neil