Best practice for variables shared between processes?
djk at tobit.co.uk
Tue Sep 21 01:01:35 BST 2010
On 20/09/10 22:15, Mark Fowler wrote:
> On 20 Sep 2010, at 17:30, Roger Burton West <roger at firedrake.org> wrote:
>> I wish to have two processes, a "producer" (which will create files) and
>> a "consumer" (which will do something with them), running
>> simultaneously. Ideally the producer would push filenames to a list as
>> it finishes producing them, while the consumer would shift them off the
>> same list (or loop-wait, if the list is empty).
> I had a better thought after my last post. Have one process create
> files and, when it's done with each file, atomically move them into a
> known directory for the second process to, um, process and delete them
> when done.
> Your second process could simply scan the directory every few seconds,
> or you could do something clever with your kernel's inotify.
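The atomic-move scheme quoted above can be sketched as follows. This is a minimal illustration, not anyone's actual code: the directory names `inbox` and `inbox.tmp` are hypothetical, and it assumes POSIX semantics where rename() is atomic when source and destination live on the same filesystem.

```python
import os

INBOX = "inbox"       # hypothetical directory the consumer watches
TMP = "inbox.tmp"     # staging area on the SAME filesystem, so rename() is atomic

def produce(name, data):
    """Write a file in the staging directory, then atomically move it."""
    os.makedirs(TMP, exist_ok=True)
    os.makedirs(INBOX, exist_ok=True)
    tmp_path = os.path.join(TMP, name)
    with open(tmp_path, "w") as fh:
        fh.write(data)
    # rename() is atomic on POSIX when both paths share a filesystem,
    # so the consumer never sees a half-written file.
    os.rename(tmp_path, os.path.join(INBOX, name))

def consume_once():
    """One scan of the inbox: process each file, delete it when done."""
    processed = []
    for name in sorted(os.listdir(INBOX)):
        path = os.path.join(INBOX, name)
        with open(path) as fh:
            processed.append((name, fh.read()))
        os.remove(path)
    return processed
```

A real consumer would wrap `consume_once()` in a loop with a short sleep, or use inotify to avoid polling altogether.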
No-one has said what the likely scale of the problem is, nor what its
time constraints are.
What I would say is that this method is well known and, it must be
said, has been abhorred by many for decades.
I can think of certain "safety of life" critical software that has been
using this method for more than 15 years and it *still* occasionally
gets its knickers in a twist. And it ain't quick.
But it is "simple", if you do it right. It is also the principle behind
the "maildir" style of (e-)mailbox, as used in packages like courier-IMAP.
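For reference, the maildir delivery step works on the same rename trick: write the message under `tmp/`, then atomically rename it into `new/`. A minimal sketch of that convention (the unique-name scheme here is the standard time/PID/hostname one; the directory name is illustrative):

```python
import os
import socket
import time

def maildir_deliver(maildir, body):
    """Deliver a message the maildir way: write to tmp/, rename into new/."""
    for sub in ("tmp", "new", "cur"):
        os.makedirs(os.path.join(maildir, sub), exist_ok=True)
    # Maildir unique names combine time, PID, and hostname, so concurrent
    # writers cannot collide without needing any locking.
    unique = "%d.%d.%s" % (time.time(), os.getpid(), socket.gethostname())
    tmp_path = os.path.join(maildir, "tmp", unique)
    with open(tmp_path, "w") as fh:
        fh.write(body)
    new_path = os.path.join(maildir, "new", unique)
    # Atomic on the same filesystem: a reader sees the whole message or nothing.
    os.rename(tmp_path, new_path)
    return new_path
```

Readers (IMAP servers, MUAs) only ever look in `new/` and `cur/`, so a crash mid-write leaves at worst a stale file in `tmp/`.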