[Dancer-users] http put to upload large file

halayudha halayudha at gmail.com
Fri Aug 5 13:08:54 CEST 2011

I use the following method to POST a large file. I think the
"multipart/form-data" encoding makes it possible to transfer the large file,
maybe because the large file is transferred in chunks rather than buffered in RAM.

<form method="POST" enctype="multipart/form-data" action="upload.cgi">
  <input type="file" name="upload">
  <input type="submit" name="press" value="OK">
</form>

I scripted the above form using curl:

curl --form upload=@localfilename --form press=OK [URL]

I even tried three concurrent clients uploading large files (3 x 600 MB).
No memory errors.
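The three-client test above can be sketched as a small shell dry run. The URL and file names here are assumptions, not from the original post; drop the `echo` to actually fire the uploads:

```shell
#!/bin/sh
# Dry run of three concurrent multipart uploads (assumed URL and file names).
upload_all() {
    url="${1:-http://localhost:3000/upload_file}"
    for f in big1.bin big2.bin big3.bin; do
        # --form sends multipart/form-data; & runs each client concurrently
        echo curl --form "upload=@$f" --form press=OK "$url" &
    done
    wait    # block until all three background clients finish
}
upload_all
```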

post '/upload_file' => sub {
    my $allupload = request->upload('filename');
    my $FILENAME  = $allupload->filename;
    debug "FILENAME: " . $FILENAME;
    my $CONTAINERPATH = params->{'containerPath'};
    # CONTAINERPATH format: /Container1/
    debug "CONTAINER: " . $CONTAINERPATH;
    my $UPLOAD_DIR = setting("REPOSITORY") . $CONTAINERPATH . $FILENAME;
    debug "UPLOAD_DIR: " . $UPLOAD_DIR;
    # copy the spooled temp file to its final location
    $allupload->copy_to($UPLOAD_DIR);
};

On Fri, Aug 5, 2011 at 6:05 PM, Richard Huxton <dev at archonet.com> wrote:

> On 05/08/11 10:49, halayudha wrote:
>> Yes, I have a memory issue, since the HTTP body is kept in RAM.
>> But I wonder, how come the POST method does not have such an issue?
> It does (well, it does here). A simple test trying to upload a .iso I had
> lying around showed the simple backend going past 500MB RAM usage and then
> falling over with "Out of memory!".
> --
>  Richard Huxton
>  Archonet Ltd

More information about the Dancer-users mailing list