Hello,

On my website I'm providing a way for users to download the contents of a table (more or less; the specifics aren't critical). Something like:

====
get '/getdata' => sub {
    my $sth = database->prepare("SELECT * FROM MyTable;");
    $sth->execute();
    my $result = "";
    while ( my @data = $sth->fetchrow_array() ) {
        $result .= join("\t", @data) . "\n";
    }
    header('Content-Type'        => 'text/tab-separated-values');
    header('Content-Disposition' => 'attachment; filename="foo.tsv"');
    return $result;
};
====

But the results can be very big (>2GB), and I don't want to build them up in memory before sending (the '$result' variable in the example above).

What's the recommended way to do this efficiently with Dancer? Is there a way to do HTTP chunked transfer encoding?
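For context, here is a rough sketch of the kind of thing I'm imagining, written against the raw PSGI streaming interface rather than Dancer's route DSL (the delayed-response form is from the PSGI spec; the DSN and table name are just placeholders, and I haven't confirmed how best to hook something like this into a Dancer route):

====
use DBI;

# A plain PSGI app that streams rows as they are fetched, instead of
# buffering the whole result in a scalar. Requires a server that
# supports psgi.streaming; with no Content-Length, an HTTP/1.1 server
# would normally fall back to chunked transfer encoding.
my $app = sub {
    my $env = shift;

    return sub {
        my $responder = shift;
        my $writer = $responder->([
            200,
            [ 'Content-Type'        => 'text/tab-separated-values',
              'Content-Disposition' => 'attachment; filename="foo.tsv"' ],
        ]);

        # Placeholder DSN, not my real connection settings.
        my $dbh = DBI->connect('dbi:SQLite:dbname=mydb.sqlite', '', '',
                               { RaiseError => 1 });
        my $sth = $dbh->prepare('SELECT * FROM MyTable');
        $sth->execute();

        # Write one row at a time so memory stays bounded regardless of
        # how large the result set is.
        while ( my @data = $sth->fetchrow_array() ) {
            $writer->write( join("\t", @data) . "\n" );
        }
        $writer->close;
    };
};
====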
Assaf Gordon