HTTP PUT to upload large file
Hi, I have a problem uploading a large file (800MB) using HTTP PUT. I use 'upload', which works fine for small files. Can someone enlighten me on this issue? Juniarto Halayudha
On 05/08/11 03:42, halayudha wrote:
Hi,
I have a problem uploading a large file (800MB) using HTTP PUT. I use 'upload', which works fine for small files. Can someone enlighten me on this issue?
Are you getting "out of memory" errors? At a guess, Plack will be building up a copy of the whole PUT request before passing it to Dancer. That might end up using multiples of the original file size.

Hmm - if it is a memory problem, you're not alone in struggling with it. Here's a chap who tries various Ruby+Rails based solutions before ending up with an Apache Java servlet: http://www.jedi.be/blog/2009/04/10/java-servlets-and-large-large-file-upload...

It's possible one of the Plack handlers will help with this, but a little bit of googling didn't show me anything obvious.

-- Richard Huxton Archonet Ltd
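If it is a memory problem, the usual fix is to read the body in fixed-size chunks rather than slurping it whole. A minimal sketch of that loop, independent of any framework (the function name and buffer size are illustrative, not from Plack or Dancer):

```perl
use strict;
use warnings;

# Copy a filehandle to disk in fixed-size chunks, so at most
# $bufsize bytes are held in memory at once. The same loop works
# for a PSGI 'psgi.input' stream as for any other read handle.
sub stream_to_file {
    my ($in, $path, $bufsize) = @_;
    $bufsize ||= 64 * 1024;    # 64KB per read
    open my $out, '>:raw', $path or die "open $path: $!";
    my $total = 0;
    while ((my $n = read($in, my $buf, $bufsize))) {
        print {$out} $buf or die "write: $!";
        $total += $n;
    }
    close $out or die "close: $!";
    return $total;             # bytes written
}
```

With this shape, memory use stays flat regardless of whether the upload is 8MB or 800MB.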
On Friday 05 August 2011 03:42:20 halayudha wrote:
Hi,
I have a problem uploading a large file (800MB) using HTTP PUT.
800MB via an HTTP PUT? That seems... over the top. Personally, I'd look at better upload mechanisms, like a protocol designed for transferring files (SFTP, SCP, et al).

I suspect that Richard is correct; it's likely exhausting memory on the server because the whole HTTP request body is being kept in RAM.

Cheers

Dave P

-- David Precious <davidp@preshweb.co.uk> (bigpresh) http://www.preshweb.co.uk/
"Programming is like sex. One mistake and you have to support it for the rest of your life." (Michael Sinz)
Yes, I have a memory issue, since the HTTP body is kept in RAM. But I wonder: why doesn't the POST method have the same issue? On Fri, Aug 5, 2011 at 5:38 PM, David Precious <davidp@preshweb.co.uk> wrote:
On Friday 05 August 2011 03:42:20 halayudha wrote:
Hi,
I have a problem uploading a large file (800MB) using HTTP PUT.
800MB via an HTTP PUT? That seems... over the top.
Personally, I'd look at better upload mechanisms, like a protocol designed for transferring files (SFTP, SCP, et al).
I suspect that Richard is correct; it's likely exhausting memory on the server because the whole HTTP request body is being kept in RAM.
Cheers
Dave P
On 05/08/11 10:49, halayudha wrote:
Yes, I have a memory issue, since the HTTP body is kept in RAM. But I wonder: why doesn't the POST method have the same issue?
It does (well, it does here). A simple test trying to upload a .iso I had lying around showed the simple backend going past 500MB RAM usage and then falling over with "Out of memory!". -- Richard Huxton Archonet Ltd
I use the following method to POST a large file. I think the "multipart/form-data" type makes it possible to transfer the large file, maybe because the large file is transferred chunked.

<form method="POST" enctype='multipart/form-data' action="upload.cgi">
<input type=file name=upload>
<input type=submit name=press value="OK">
</form>

I scripted the above form using curl:

curl --form upload=@localfilename --form press=OK [URL]

I even tried three concurrent clients uploading large files (3 x 600MB). No memory error.

post '/upload_file' => sub {
    my $allupload = request->upload('filename');
    my $FILENAME  = $allupload->filename;
    debug "FILENAME: " . $FILENAME;
    my $CONTAINERPATH = params->{'containerPath'};
    debug "CONTAINER: " . $CONTAINERPATH;
    # CONTAINERPATH format: /Container1/
    my $UPLOAD_DIR = setting("REPOSITORY") . $CONTAINERPATH . $FILENAME;
    debug "UPLOAD_DIR: " . $UPLOAD_DIR;
    $allupload->copy_to($UPLOAD_DIR);

On Fri, Aug 5, 2011 at 6:05 PM, Richard Huxton <dev@archonet.com> wrote:
On 05/08/11 10:49, halayudha wrote:
Yes, I have a memory issue, since the HTTP body is kept in RAM. But I wonder: why doesn't the POST method have the same issue?
It does (well, it does here). A simple test trying to upload a .iso I had lying around showed the simple backend going past 500MB RAM usage and then falling over with "Out of memory!".
OK, resending - I accidentally sent an incomplete message. This is the complete version.

I use the following method to POST a large file. I think the "multipart/form-data" type makes it possible to transfer the large file, maybe because the large file is transferred chunked.

<form method="POST" enctype='multipart/form-data' action="upload.cgi">
<input type=file name=upload>
<input type=submit name=press value="OK">
</form>

I scripted the above form using curl:

curl --form upload=@localfilename --form press=OK [URL]

I even tried three concurrent clients uploading large files (3 x 600MB). No memory error.

post '/upload_file' => sub {
    my $allupload = request->upload('filename');
    my $FILENAME  = $allupload->filename;
    debug "FILENAME: " . $FILENAME;
    my $CONTAINERPATH = params->{'containerPath'};
    debug "CONTAINER: " . $CONTAINERPATH;
    # CONTAINERPATH format: /Container1/
    my $UPLOAD_DIR = setting("REPOSITORY") . $CONTAINERPATH . $FILENAME;
    debug "UPLOAD_DIR: " . $UPLOAD_DIR;
    $allupload->copy_to($UPLOAD_DIR);
};

But PUT is definitely not working, even for one client, because Dancer requires the whole HTTP request body to be held in a variable [see the Dancer upload part]. So I am wondering whether it is possible to do "multipart" using PUT.
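One avenue worth testing (a sketch only, not confirmed against this setup): bypass Dancer's buffered body and read the raw PSGI input stream from the request environment. Whether 'psgi.input' is still unread by the time the route handler runs depends on the server and middleware in front of Dancer, so this may or may not avoid the buffering in practice:

```perl
use Dancer;

# Sketch: stream a PUT body to disk in chunks instead of touching
# request->body (which holds the whole thing in RAM). The route,
# setting name, and error handling are assumptions for illustration.
put '/upload/:filename' => sub {
    my $in   = request->env->{'psgi.input'};   # raw PSGI body stream
    my $path = setting('REPOSITORY') . params->{filename};
    open my $out, '>:raw', $path or return send_error("open: $!", 500);
    while ((my $n = read($in, my $buf, 64 * 1024))) {
        print {$out} $buf;
    }
    close $out;
    return "stored " . (-s $path) . " bytes\n";
};
```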
On Fri, Aug 5, 2011 at 6:05 PM, Richard Huxton <dev@archonet.com> wrote:
On 05/08/11 10:49, halayudha wrote:
Yes, I have a memory issue, since the HTTP body is kept in RAM. But I wonder: why doesn't the POST method have the same issue?
It does (well, it does here). A simple test trying to upload a .iso I had lying around showed the simple backend going past 500MB RAM usage and then falling over with "Out of memory!".
On 05/08/11 12:12, halayudha wrote:
But PUT is definitely not working, even for one client, because Dancer requires the whole HTTP request body to be held in a variable [see the Dancer upload part].
So I am wondering whether it is possible to do "multipart" using PUT.
Ah, I hadn't realised the implications of using PUT rather than POST.

You still might be better off handling the uploads with a separate script/module. That way you can have persistent backends handling normal requests and one or more separate long-running upload backends, presumably returning a redirect or some such on completion.

-- Richard Huxton Archonet Ltd
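A sketch of that separate-backend idea, as a raw PSGI app deployed on its own, apart from the main Dancer app (the destination directory, file naming, and redirect target are made up for illustration):

```perl
use strict;
use warnings;

# Build a minimal PSGI upload backend: streams the request body to
# disk in 64KB chunks and answers with a redirect on completion.
# Run it as its own long-running process so big uploads don't tie up
# the backends serving normal requests.
sub make_upload_app {
    my ($dest_dir) = @_;
    return sub {
        my $env  = shift;
        my $in   = $env->{'psgi.input'};   # body stream per the PSGI spec
        my $path = "$dest_dir/upload-$$-" . time();
        open my $out, '>:raw', $path or die "open $path: $!";
        while ((my $n = read($in, my $buf, 64 * 1024))) {
            print {$out} $buf;
        }
        close $out or die "close: $!";
        return [302, ['Location', '/done'], []];   # redirect when done
    };
}
```

The coderef this returns can be handed to any PSGI server (plackup, Starman, etc.) independently of the main app.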
On 08/05/2011 11:38 AM, David Precious wrote:
On Friday 05 August 2011 03:42:20 halayudha wrote:
Hi,
I have a problem uploading a large file (800MB) using HTTP PUT.
800MB via an HTTP PUT? That seems... over the top.
Personally, I'd look at better upload mechanisms, like a protocol designed for transferring files (SFTP, SCP, et al).
Right, but clients ask to transfer huge files with the browser all the same.

Regards
Racke

-- LinuXia Systems => http://www.linuxia.de/ Expert Interchange Consulting and System Administration ICDEVGROUP => http://www.icdevgroup.org/ Interchange Development Team
On Fri, Aug 5, 2011 at 1:21 PM, Stefan Hornburg (Racke) <racke@linuxia.de> wrote:
On 08/05/2011 11:38 AM, David Precious wrote:
On Friday 05 August 2011 03:42:20 halayudha wrote:
Hi,
I have a problem uploading a large file (800MB) using HTTP PUT.
800MB via an HTTP PUT? That seems... over the top.
Personally, I'd look at better upload mechanisms, like a protocol designed for transferring files (SFTP, SCP, et al).
Right, but clients ask to transfer huge files with the browser all the same.
First of all, the fact is that transferring huge files over HTTP is not going to work well. And if you make it work, you've only done so by stretching an unsuitable technology to its limits and jumping through hoops. So, technically this is the wrong solution. Not much discussion regarding that, IMHO.

The question you're raising is different: "I need to transfer huge files online and I need to use a web interface for the client. What should I do?" In that case the answer is very simple: write a method that returns right away but runs the upload code asynchronously, using FTP or some other protocol designed for file transfer. Very simple.

While clients should get what they want almost all the time, they don't understand the technology, and since you do, you're in charge of making correct use of it. "No, I can't make your website shoot bubbles from any random user's floppy device."
participants (5)
- David Precious
- halayudha
- Richard Huxton
- sawyer x
- Stefan Hornburg (Racke)