Large files in blend4j
Hi all, I've been using blend4j for some time now and I'm quite happy with it, but I've recently run into some OutOfMemory issues. When I try to upload large files to Galaxy (I know HTTP might not be the best option), some byte array grows too large. A possible solution would be file chunking in Jersey. Is this possible in blend4j, and how could I implement this feature? Thanks, Eric
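For context on the OutOfMemory symptom: an upload that materializes the whole file as one byte array needs heap proportional to the file size, while copying through a fixed-size buffer keeps memory bounded. A minimal illustrative sketch (plain JDK, not blend4j API; names are my own):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Illustration of why streaming avoids OutOfMemory: a small, reused
// buffer moves the data chunk by chunk, so memory use stays constant
// no matter how large the input is. This is a sketch, not blend4j code.
public class StreamingCopy {
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192]; // bounded buffer, reused for every chunk
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }
}
```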
Glad to hear blend4j is working well, and annoyed to hear it is not streaming files out by default. Based on your ideas I started some work here, but it is not right: I spent the last half an hour looking around Galaxy trying to figure out why history creation via the API was broken, not realizing it was because I had called setChunkedEncodingSize on the client.

https://github.com/jmchilton/blend4j/commit/22608de210f805fc192c86a836ea55c7...

I have created an issue on GitHub for this: https://github.com/jmchilton/blend4j/issues/5

Feel free to hack on this some more. It might actually take some non-trivial refactoring to have client.setChunkedEncodingSize(CHUNKED_ENCODING_SIZE); set conditionally, since right now all requests share the same underlying client object. I will try to work on this more as I have time.

-John

On Mon, Sep 9, 2013 at 10:58 AM, Eric Kuyt <eric.kuijt@wur.nl> wrote:
Hi all,
I've been using blend4j for some time now and I'm quite happy with it, but I've recently run into some OutOfMemory issues. When I try to upload large files to Galaxy (I know HTTP might not be the best option), some byte array grows too large. A possible solution would be file chunking in Jersey. Is this possible in blend4j, and how could I implement this feature?
Thanks,
Eric
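To make the chunked-encoding idea concrete: at the HTTP layer, the equivalent JDK-only knob is HttpURLConnection.setChunkedStreamingMode, which sends the request body in fixed-size chunks instead of buffering it. A hedged sketch (plain JDK, not blend4j/Jersey; in Jersey 1.x the analogous switch is the Client.setChunkedEncodingSize call discussed above; the URL and file here are placeholders):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch of chunked transfer encoding with only the JDK: once
// setChunkedStreamingMode is enabled, the request body streams out in
// fixed-size chunks and is never fully held in memory.
public class ChunkedUpload {
    static final int CHUNKED_ENCODING_SIZE = 4096;

    public static HttpURLConnection openChunked(URL url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        // Must be set before the connection is opened; afterwards the
        // body is sent in CHUNKED_ENCODING_SIZE-byte chunks.
        conn.setChunkedStreamingMode(CHUNKED_ENCODING_SIZE);
        return conn;
    }

    public static void upload(URL url, File file) throws Exception {
        HttpURLConnection conn = openChunked(url);
        try (InputStream in = new FileInputStream(file);
             OutputStream out = conn.getOutputStream()) {
            byte[] buf = new byte[CHUNKED_ENCODING_SIZE];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        }
        conn.getResponseCode(); // read the status to complete the exchange
    }
}
```

The caveat John hit applies here too: if one shared client has chunked encoding enabled globally, every request is affected, which is why a conditional or per-request setting likely needs some refactoring in blend4j.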
participants (2)
- Eric Kuyt
- John Chilton