On Thu, Sep 20, 2012 at 11:46 AM, Assaf Gordon <gordon@cshl.edu> wrote:
Hello,
I'm planning a web site (w/ Dancer, of course :) ) that will execute shell scripts (to compute some results) and eventually return the results to the user.
The web-side is simple enough, but the shell scripts might take anywhere between 10 seconds to 10 minutes to execute.
Is there a recommended way to manage external jobs for this type of scenario? One extreme is to use SGE/PBS and build a whole database-backed queuing system.
If you want to go this route, I like to use https://metacpan.org/module/Dancer::Plugin::Stomp. Installing and deploying the whole queuing system is simply a matter of running:

    sudo cpan POE::Component::MessageQueue
    sudo mq.pl

-Naveed
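To make that concrete, here is a minimal sketch of the web side of that setup. Assumptions (not from the thread): the broker started by mq.pl is listening locally on STOMP's default port 61613, the queue name /queue/jobs is made up for illustration, and it talks to the broker with Net::Stomp directly rather than through the plugin's keywords:

```perl
# Web side: accept a request and enqueue the job instead of running it inline.
use Dancer;
use Net::Stomp;

post '/jobs' => sub {
    # Assumption: mq.pl (POE::Component::MessageQueue) is running locally
    # on the default STOMP port, 61613.
    my $stomp = Net::Stomp->new({ hostname => 'localhost', port => 61613 });
    $stomp->connect({ login => '', passcode => '' });
    $stomp->send({
        destination => '/queue/jobs',           # hypothetical queue name
        body        => params->{script_args},   # whatever the worker needs
    });
    $stomp->disconnect;
    return "Job queued; results will be emailed when done.\n";
};

dance;
```

A separate worker process would subscribe to /queue/jobs, run the shell script with system(), and mail the result when it finishes, keeping the 10-second-to-10-minute jobs out of the web request entirely.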
The other extreme is perhaps to execute the shell scripts serially (one after the other) and just send the results to the users by email.
But if anyone has experience with something similar, any advice will be appreciated. (This is supposed to be a short-term project, just a front-end to some unix scripts - so I prefer to keep it simple, not build a full-fledged infrastructure from scratch).
Thanks,
 -gordon
_______________________________________________
Dancer-users mailing list
Dancer-users@perldancer.org
http://www.backup-manager.org/cgi-bin/listinfo/dancer-users