Brown trousers time :~
Lyle - CosmicPerl.com
perl at cosmicperl.com
Mon Oct 8 01:47:33 BST 2007
Soon I'll be embarking on the largest project I've ever undertaken.
It's going to be tens of thousands of lines of code. It needs to be
perfect (or damn close :))
I've been doing a LOT of reading through my blue shelves of O'Reilly...
As well as hitting a lot of different forums and lists, and working my
way through everything I can find.
Parts of the software need to be able to withstand large volumes of
traffic... I'm talking hundreds, thousands, or even tens of thousands of clicks.
This all has me thoroughly bricking it :~
From what I've learned, it'll have to be mod_perl handling the heavy-traffic
parts of the software. Basically CGI scripts that open a
database connection, read data, then write data and redirect the browser.
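To make that concrete, here's a minimal sketch of the kind of script I mean, assuming it runs under ModPerl::Registry with Apache::DBI loaded in httpd.conf so the connect() is cached per child process. The database name, table, and column names are all made up for illustration:

```perl
#!/usr/bin/perl
# Hypothetical click-tracking script for ModPerl::Registry.
# With Apache::DBI preloaded, DBI->connect returns a cached per-process
# handle, so we don't pay for a fresh MySQL connection on every hit.
use strict;
use warnings;
use CGI;
use DBI;

my $q   = CGI->new;
my $dbh = DBI->connect(
    'dbi:mysql:database=clicks', 'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

my $id = $q->param('id');

# Read: look up where this link points.
my ($dest) = $dbh->selectrow_array(
    'SELECT url FROM links WHERE id = ?', undef, $id );

# Write: bump the hit counter for this link.
$dbh->do( 'UPDATE links SET hits = hits + 1 WHERE id = ?', undef, $id );

# Send the browser on its way.
print $q->redirect($dest);
```

This is only a sketch of the read/write/redirect shape, not a finished design; it obviously needs a running Apache and MySQL behind it.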
From all my searching I still have a few questions unanswered; I'm hoping
you guys can help...
I'm concerned that I'll have to quickly write some C libraries for the
heavy-traffic parts. The book I've found referenced most is "Extending
and Embedding Perl". Is this the best book to get? Or do you guys
recommend others? Or do you recommend other books to get along with this one?
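For anyone wondering what I mean by "C libraries for the heavy parts", this is the sort of thing, done here with Inline::C rather than hand-rolled XS (the function is a toy of my own invention, and it needs Inline::C and a C compiler installed to run):

```perl
# Toy example of pushing a hot loop down into C from Perl.
# Inline::C compiles the C code the first time the script runs.
use strict;
use warnings;
use Inline C => <<'END_C';
int sum_to(int n) {
    int total = 0;
    int i;
    for (i = 1; i <= n; i++) total += i;
    return total;
}
END_C

print sum_to(100), "\n";    # prints 5050
```

The book covers the full XS route as well, which is what you'd want for a proper reusable library rather than an inline snippet.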
What's the mod_perl equivalent on Win32? I'm guessing PerlScript in ASP,
but is it as fast? I can't find any benchmarks.
Would it be best to have separate databases (all in MySQL) for different
parts of the program, so that the heavily accessed tables are totally
separate from those that aren't?
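To illustrate the kind of split I'm imagining (table and column names are hypothetical), the write-hot counter data would live in its own narrow table, away from the wide, rarely-written descriptive data:

```sql
-- Hypothetical hot table: tiny rows, hammered by updates.
CREATE TABLE link_hits (
    link_id INT UNSIGNED NOT NULL PRIMARY KEY,
    hits    INT UNSIGNED NOT NULL DEFAULT 0
) ENGINE=InnoDB;

-- Hypothetical cold table: wide rows, mostly read-only.
CREATE TABLE links (
    link_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    url     VARCHAR(255) NOT NULL,
    title   VARCHAR(255),
    created DATETIME,
    notes   TEXT
) ENGINE=InnoDB;
```

Part of my thinking is the storage engine choice too: MyISAM locks the whole table on writes, while InnoDB does row-level locking, which seems to matter a lot for the hammered table.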
Anybody got some spare underpants? (preferably not white ones)
I want everything to be as realtime as possible. But this would mean
updating several tables for each of those hits, and I get the nasty feeling
that will be too slow. So would it probably be better to have a cron job
updating some tables every 10 minutes or so, and keep the heavy
updating to a single table?
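The idea in miniature (with made-up link ids and table names): log raw hits cheaply as they arrive, then let the cron job fold them into the summary tables in one pass, so the per-hit path only ever touches one table:

```perl
# Toy roll-up of the batching idea: many raw hits become one
# batched UPDATE per link instead of one UPDATE per hit.
use strict;
use warnings;

# Raw hits as they might sit in a fast append-only log
# (one link id per hit; values are invented for the example).
my @raw_hits = qw(7 3 7 7 9 3);

# Aggregate in memory...
my %count;
$count{$_}++ for @raw_hits;

# ...then emit one batched UPDATE per link.
for my $id ( sort keys %count ) {
    printf "UPDATE links SET hits = hits + %d WHERE id = %s;\n",
        $count{$id}, $id;
}
```

The real version would read from a log table and run the updates through DBI, kicked off by something like a `*/10 * * * *` crontab entry (path to the script being whatever you install).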
I think I'm up to speed (for now at least) with everything else.
More information about the london.pm mailing list