Brown trousers time :~
Gareth Harper
gareth at migcan.com
Tue Oct 9 16:12:05 BST 2007
Lyle - CosmicPerl.com wrote:
> I'm currently thinking that for the Enterprise version I'll have a
> separate web server with FastCGI to handle all the clicks and nothing
> else, recording the data raw to a database (this means I don't need to
> do a read before the write), then periodically taking that data and
> sorting/compressing it into another database. That way it would be
> easier to have several machines with independent web servers and
> databases if needed in the future.
>
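That raw-insert-then-aggregate split sounds sensible. For what it's
worth, a rough sketch of the roll-up job I'd run from cron (the table
names and the MySQL-isms are invented, adjust to your schema):

    use strict;
    use warnings;
    use DBI;

    # Roll raw clicks up into per-campaign totals.
    my $dbh = DBI->connect('dbi:mysql:database=clicks', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    # Only touch rows that existed when we started, so clicks arriving
    # mid-run are left for the next pass.
    my ($max_id) = $dbh->selectrow_array('SELECT MAX(id) FROM raw_clicks');
    exit 0 unless defined $max_id;

    $dbh->do(q{
        INSERT INTO click_totals (campaign_id, clicks)
        SELECT campaign_id, COUNT(*) FROM raw_clicks WHERE id <= ?
        GROUP BY campaign_id
        ON DUPLICATE KEY UPDATE clicks = clicks + VALUES(clicks)
    }, undef, $max_id);
    $dbh->do('DELETE FROM raw_clicks WHERE id <= ?', undef, $max_id);
    $dbh->commit;
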
> Potentially C++ and FastCGI for the clicks; I can't see anything being
> faster than that! Not sure how hard it would be to code, though.
> Haven't done any C++ for years.
Premature optimisation being the root of all evil: write it first, in
whatever language you like, as a prototype you plan on throwing away;
profile it, find out where it isn't fast enough, and then concentrate
on those parts.
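If you've not profiled Perl before, Devel::DProf in the core will get
you most of the way; assuming your handler can be run as a plain script
(click_handler.pl is a stand-in for whatever your entry point is):

    perl -d:DProf click_handler.pl
    dprofpp

dprofpp reads the tmon.out the first command leaves behind and shows
you where the time actually went.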
> Also, I'm figuring that rather than opening/writing/closing a database
> connection for each click, there is a way of keeping the connection
> open so that all I need to do is write? Can this be done easily with
> Perl? Are there any drawbacks to doing this?
>
Apparently both mod_perl and FastCGI can deal with this comfortably (I
tend to use mod_perl, but our Ruby applications here use FastCGI and I'm
told it does the same). I've written several database-intensive
applications which handle hundreds of transactions per second (multiple
inserts, updates and selects per transaction). I've also co-written a C
Apache module, and believe me, writing it in Perl first made it much
easier to understand how things were going to work (the Perl version was
fast enough, it simply used too much memory).
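To make it concrete: under FastCGI you connect (and prepare) once
before the request loop, and the handle then lives for the life of the
process. A minimal sketch, with the DSN and table invented for the
example:

    use strict;
    use warnings;
    use CGI::Fast;
    use DBI;

    # One connection and one prepared statement per FastCGI process,
    # reused across every request that process serves.
    my $dbh = DBI->connect('dbi:mysql:database=clicks', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 1 });
    my $sth = $dbh->prepare(
        'INSERT INTO raw_clicks (campaign_id, ip) VALUES (?, ?)'
    );

    while (my $q = CGI::Fast->new) {
        $sth->execute(scalar $q->param('campaign_id'), $q->remote_addr);
        print $q->header('text/plain'), "OK\n";
    }

Under mod_perl, Apache::DBI does the equivalent caching behind
DBI->connect, and DBI->connect_cached works in either world. The main
drawback is that a long-lived handle can go stale if the database
restarts, so check $dbh->ping (or turn on DBD::mysql's
mysql_auto_reconnect) rather than assuming it's still alive.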